
WebQAmGaze: A Multilingual Webcam Eye-Tracking-While-Reading Dataset (2303.17876v3)

Published 31 Mar 2023 in cs.CL

Abstract: We present WebQAmGaze, a multilingual, low-cost eye-tracking-while-reading dataset, designed as the first webcam-based eye-tracking corpus of reading to support the development of explainable computational language processing models. WebQAmGaze includes webcam eye-tracking data from 600 participants of a wide age range naturally reading English, German, Spanish, and Turkish texts. Each participant performs two reading tasks (normal reading and information-seeking) of five texts each, with each text followed by a comprehension question. We compare the collected webcam data to high-quality eye-tracking recordings and find a moderate to strong correlation between the eye-movement measures obtained with the webcam and those obtained with a commercial eye-tracking device. When validating the data, we find that higher fixation duration on relevant text spans accurately indicates correctness when answering the corresponding questions. This dataset advances webcam-based reading studies and opens avenues to low-cost and diverse data collection. WebQAmGaze can help researchers study the cognitive processes behind question answering and apply these insights to computational models of language understanding.
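The validation step described above, correlating webcam-derived reading measures with those from a commercial eye tracker, can be sketched in a few lines. The sketch below is illustrative only: the `Fixation` structure, the per-word total-reading-time aggregation, and all numbers are hypothetical placeholders, not the paper's actual data format or pipeline.

```python
from dataclasses import dataclass
from collections import defaultdict
import math

@dataclass
class Fixation:
    word_idx: int       # index of the word the fixation landed on
    duration_ms: float  # fixation duration in milliseconds

def total_reading_time(fixations):
    """Sum fixation durations per word (TRT, a standard reading measure)."""
    trt = defaultdict(float)
    for f in fixations:
        trt[f.word_idx] += f.duration_ms
    return trt

def pearson(xs, ys):
    """Pearson correlation between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical recordings of the same passage: one from a webcam-based
# tracker, one from a commercial eye tracker.
webcam = [Fixation(0, 210), Fixation(1, 180), Fixation(1, 90), Fixation(2, 260)]
tracker = [Fixation(0, 190), Fixation(1, 240), Fixation(2, 230)]

w, t = total_reading_time(webcam), total_reading_time(tracker)
words = sorted(set(w) | set(t))
r = pearson([w.get(i, 0.0) for i in words], [t.get(i, 0.0) for i in words])
```

A correlation `r` near 1 over many participants and texts would indicate that the cheaper webcam signal preserves the relative per-word reading effort captured by the commercial device, which is the kind of agreement the abstract reports.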

[2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Hollenstein, N., Torre, A., Langer, N., Zhang, C.: CogniVal: A framework for cognitive word embedding evaluation. In: Proceedings of the 23rd Conference on Computational Natural Language Learning (CoNLL), pp. 538–549. Association for Computational Linguistics, Hong Kong, China (2019). https://doi.org/10.18653/v1/K19-1050 . https://aclanthology.org/K19-1050 Sood et al. [2020] Sood, E., Tannert, S., Frassinelli, D., Bulling, A., Vu, N.T.: Interpreting attention models with human visual attention in machine reading comprehension. In: Proceedings of the 24th Conference on Computational Natural Language Learning, pp. 12–25. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.conll-1.2 . https://aclanthology.org/2020.conll-1.2 Brandl and Hollenstein [2022] Brandl, S., Hollenstein, N.: Every word counts: A multilingual analysis of individual human alignment with model attention. In: Proceedings of the 2nd Conference of the Asia-Pacific Chapter of the Association for Computational Linguistics and the 12th International Joint Conference on Natural Language Processing (Volume 2: Short Papers), pp. 72–77. Association for Computational Linguistics, Online only (2022). https://aclanthology.org/2022.aacl-short.10 Søgaard [2021] Søgaard, A.: Explainable Natural Language Processing. Morgan & Claypool Publishers, ??? (2021) Xu et al. [2009] Xu, S., Jiang, H., Lau, F.C.M.: User-Oriented Document Summarization through Vision-Based Eye-Tracking. In: Proceedings of the 14th International Conference on Intelligent User Interfaces. IUI ’09, pp. 7–16. Association for Computing Machinery, New York, NY, USA (2009). https://doi.org/10.1145/1502650.1502656 . https://doi.org/10.1145/1502650.1502656 Barrett et al. 
[2016] Barrett, M., Bingel, J., Keller, F., Søgaard, A.: Weakly Supervised Part-of-speech Tagging Using Eye-tracking Data. In: Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers), pp. 579–584. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/P16-2094 . https://aclanthology.org/P16-2094 Hollenstein and Zhang [2019] Hollenstein, N., Zhang, C.: Entity recognition at first sight: Improving NER with eye movement information. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 1–10. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1001 . https://aclanthology.org/N19-1001 Hollenstein and Beinborn [2021] Hollenstein, N., Beinborn, L.: Relative importance in sentence processing. In: Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 2: Short Papers), pp. 141–150. Association for Computational Linguistics, Online (2021). https://doi.org/10.18653/v1/2021.acl-short.19 . https://aclanthology.org/2021.acl-short.19 McGuire and Tomuro [2021] McGuire, E., Tomuro, N.: Sentiment analysis with cognitive attention supervision. In: Canadian Conference on AI (2021) Dong et al. [2022] Dong, S., Goldstein, J., Yang, G.H.: GazBy: Gaze-Based BERT Model to Incorporate Human Attention in Neural Information Retrieval. In: Proceedings of the 2022 ACM SIGIR International Conference on Theory of Information Retrieval, pp. 182–192 (2022). https://doi.org/10.1145/3539813.3545129 Eberle et al. [2022] Eberle, O., Brandl, S., Pilot, J., Søgaard, A.: Do transformer models show similar attention patterns to task-specific human gaze? 
In: Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pp. 4295–4309. Association for Computational Linguistics, Dublin, Ireland (2022). https://doi.org/10.18653/v1/2022.acl-long.296 . https://aclanthology.org/2022.acl-long.296 Hollenstein et al. [2020] Hollenstein, N., Troendle, M., Zhang, C., Langer, N.: ZuCo 2.0: A dataset of physiological recordings during natural reading and annotation. In: Proceedings of the Twelfth Language Resources and Evaluation Conference, pp. 138–146. European Language Resources Association, Marseille, France (2020). https://aclanthology.org/2020.lrec-1.18 Malmaud et al. [2020] Malmaud, J., Levy, R., Berzak, Y.: Bridging information-seeking human gaze and machine reading comprehension. In: Proceedings of the 24th Conference on Computational Natural Language Learning, pp. 142–152. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.conll-1.11 . https://aclanthology.org/2020.conll-1.11 de Leeuw [2015] Leeuw, J.R.: jsPsych: A JavaScript library for creating behavioral experiments in a Web browser. Behavior Research Methods 47(1), 1–12 (2015) https://doi.org/10.3758/s13428-014-0458-y Papoutsaki et al. [2016] Papoutsaki, A., Sangkloy, P., Laskey, J., Daskalova, N., Huang, J., Hays, J.: Webgazer: Scalable webcam eye tracking using user interactions. In: Proceedings of the Twenty-Fifth International Joint Conference on Artificial Intelligence. IJCAI’16, pp. 3839–3845. AAAI Press, ??? (2016) Rayner et al. [2012] Rayner, K., Pollatsek, A., Ashby, J., Clifton Jr, C.: Psychology of reading (2012) Liversedge and Findlay [2000] Liversedge, S.P., Findlay, J.M.: Saccadic eye movements and cognition. Trends in cognitive sciences 4(1), 6–14 (2000) Rayner [1998] Rayner, K.: Eye movements in reading and information processing: 20 years of research. 
Psychological bulletin 124(3), 372 (1998) Radach and Kennedy [2004] Radach, R., Kennedy, A.: Theoretical perspectives on eye movements in reading: Past controversies, current issues, and an agenda for future research. European journal of cognitive psychology 16(1-2), 3–26 (2004) Winke [2013] Winke, P.M.: Eye-tracking technology for reading. The companion to language assessment 2, 1029–1046 (2013) Hollenstein et al. [2018] Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: Zuco, a simultaneous eeg and eye-tracking resource for natural sentence reading. Scientific data 5(1), 1–13 (2018) Cop et al. [2017] Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior research methods 49, 602–615 (2017) Berzak et al. [2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: Celer: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. 
[2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. 
In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . 
Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. 
[2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. 
Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. 
[2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Sood, E., Tannert, S., Frassinelli, D., Bulling, A., Vu, N.T.: Interpreting attention models with human visual attention in machine reading comprehension. In: Proceedings of the 24th Conference on Computational Natural Language Learning, pp. 12–25. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.conll-1.2 . 
https://aclanthology.org/2020.conll-1.2 Brandl and Hollenstein [2022] Brandl, S., Hollenstein, N.: Every word counts: A multilingual analysis of individual human alignment with model attention. In: Proceedings of the 2nd Conference of the Asia-Pacific Chapter of the Association for Computational Linguistics and the 12th International Joint Conference on Natural Language Processing (Volume 2: Short Papers), pp. 72–77. Association for Computational Linguistics, Online only (2022). https://aclanthology.org/2022.aacl-short.10 Søgaard [2021] Søgaard, A.: Explainable Natural Language Processing. Morgan & Claypool Publishers, ??? (2021) Xu et al. [2009] Xu, S., Jiang, H., Lau, F.C.M.: User-Oriented Document Summarization through Vision-Based Eye-Tracking. In: Proceedings of the 14th International Conference on Intelligent User Interfaces. IUI ’09, pp. 7–16. Association for Computing Machinery, New York, NY, USA (2009). https://doi.org/10.1145/1502650.1502656 . https://doi.org/10.1145/1502650.1502656 Barrett et al. [2016] Barrett, M., Bingel, J., Keller, F., Søgaard, A.: Weakly Supervised Part-of-speech Tagging Using Eye-tracking Data. In: Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers), pp. 579–584. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/P16-2094 . https://aclanthology.org/P16-2094 Hollenstein and Zhang [2019] Hollenstein, N., Zhang, C.: Entity recognition at first sight: Improving NER with eye movement information. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 1–10. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1001 . https://aclanthology.org/N19-1001 Hollenstein and Beinborn [2021] Hollenstein, N., Beinborn, L.: Relative importance in sentence processing. 
In: Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 2: Short Papers), pp. 141–150. Association for Computational Linguistics, Online (2021). https://doi.org/10.18653/v1/2021.acl-short.19 . https://aclanthology.org/2021.acl-short.19 McGuire and Tomuro [2021] McGuire, E., Tomuro, N.: Sentiment analysis with cognitive attention supervision. In: Canadian Conference on AI (2021) Dong et al. [2022] Dong, S., Goldstein, J., Yang, G.H.: GazBy: Gaze-Based BERT Model to Incorporate Human Attention in Neural Information Retrieval. In: Proceedings of the 2022 ACM SIGIR International Conference on Theory of Information Retrieval, pp. 182–192 (2022). https://doi.org/10.1145/3539813.3545129 Eberle et al. [2022] Eberle, O., Brandl, S., Pilot, J., Søgaard, A.: Do transformer models show similar attention patterns to task-specific human gaze? In: Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pp. 4295–4309. Association for Computational Linguistics, Dublin, Ireland (2022). https://doi.org/10.18653/v1/2022.acl-long.296 . https://aclanthology.org/2022.acl-long.296 Hollenstein et al. [2020] Hollenstein, N., Troendle, M., Zhang, C., Langer, N.: ZuCo 2.0: A dataset of physiological recordings during natural reading and annotation. In: Proceedings of the Twelfth Language Resources and Evaluation Conference, pp. 138–146. European Language Resources Association, Marseille, France (2020). https://aclanthology.org/2020.lrec-1.18 Malmaud et al. [2020] Malmaud, J., Levy, R., Berzak, Y.: Bridging information-seeking human gaze and machine reading comprehension. In: Proceedings of the 24th Conference on Computational Natural Language Learning, pp. 142–152. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.conll-1.11 . 
https://aclanthology.org/2020.conll-1.11 de Leeuw [2015] Leeuw, J.R.: jsPsych: A JavaScript library for creating behavioral experiments in a Web browser. Behavior Research Methods 47(1), 1–12 (2015) https://doi.org/10.3758/s13428-014-0458-y Papoutsaki et al. [2016] Papoutsaki, A., Sangkloy, P., Laskey, J., Daskalova, N., Huang, J., Hays, J.: Webgazer: Scalable webcam eye tracking using user interactions. In: Proceedings of the Twenty-Fifth International Joint Conference on Artificial Intelligence. IJCAI’16, pp. 3839–3845. AAAI Press, ??? (2016) Rayner et al. [2012] Rayner, K., Pollatsek, A., Ashby, J., Clifton Jr, C.: Psychology of reading (2012) Liversedge and Findlay [2000] Liversedge, S.P., Findlay, J.M.: Saccadic eye movements and cognition. Trends in cognitive sciences 4(1), 6–14 (2000) Rayner [1998] Rayner, K.: Eye movements in reading and information processing: 20 years of research. Psychological bulletin 124(3), 372 (1998) Radach and Kennedy [2004] Radach, R., Kennedy, A.: Theoretical perspectives on eye movements in reading: Past controversies, current issues, and an agenda for future research. European journal of cognitive psychology 16(1-2), 3–26 (2004) Winke [2013] Winke, P.M.: Eye-tracking technology for reading. The companion to language assessment 2, 1029–1046 (2013) Hollenstein et al. [2018] Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: Zuco, a simultaneous eeg and eye-tracking resource for natural sentence reading. Scientific data 5(1), 1–13 (2018) Cop et al. [2017] Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior research methods 49, 602–615 (2017) Berzak et al. [2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: Celer: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. 
[2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior Research Methods, 1–21 (2022). Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in English as a second language: Evidence from the Multilingual Eye-Movements Corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The Copenhagen Corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior Research Methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior Research Methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals.
arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance.
In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: TurkerGaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al.
[2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Human Information Interaction and Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature Communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al.
[2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press, Cambridge (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological Review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 .
https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. 
[2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Brandl and Hollenstein [2022] Brandl, S., Hollenstein, N.: Every word counts: A multilingual analysis of individual human alignment with model attention. In: Proceedings of the 2nd Conference of the Asia-Pacific Chapter of the Association for Computational Linguistics and the 12th International Joint Conference on Natural Language Processing (Volume 2: Short Papers), pp. 72–77. Association for Computational Linguistics, Online only (2022). https://aclanthology.org/2022.aacl-short.10 Søgaard [2021] Søgaard, A.: Explainable Natural Language Processing. Morgan & Claypool Publishers (2021) Xu et al. [2009] Xu, S., Jiang, H., Lau, F.C.M.: User-Oriented Document Summarization through Vision-Based Eye-Tracking. In: Proceedings of the 14th International Conference on Intelligent User Interfaces. IUI ’09, pp. 7–16. Association for Computing Machinery, New York, NY, USA (2009). https://doi.org/10.1145/1502650.1502656 Barrett et al. [2016] Barrett, M., Bingel, J., Keller, F., Søgaard, A.: Weakly Supervised Part-of-speech Tagging Using Eye-tracking Data. In: Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers), pp. 579–584. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/P16-2094 . https://aclanthology.org/P16-2094 Hollenstein and Zhang [2019] Hollenstein, N., Zhang, C.: Entity recognition at first sight: Improving NER with eye movement information.
In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 1–10. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1001 . https://aclanthology.org/N19-1001
[2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. 
[2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. 
[2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. 
[2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 
71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. 
Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 https://aclanthology.org/N19-1423
[2012] Rayner, K., Pollatsek, A., Ashby, J., Clifton Jr, C.: Psychology of reading (2012) Liversedge and Findlay [2000] Liversedge, S.P., Findlay, J.M.: Saccadic eye movements and cognition. Trends in cognitive sciences 4(1), 6–14 (2000) Rayner [1998] Rayner, K.: Eye movements in reading and information processing: 20 years of research. Psychological bulletin 124(3), 372 (1998) Radach and Kennedy [2004] Radach, R., Kennedy, A.: Theoretical perspectives on eye movements in reading: Past controversies, current issues, and an agenda for future research. European journal of cognitive psychology 16(1-2), 3–26 (2004) Winke [2013] Winke, P.M.: Eye-tracking technology for reading. The companion to language assessment 2, 1029–1046 (2013) Hollenstein et al. [2018] Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: Zuco, a simultaneous eeg and eye-tracking resource for natural sentence reading. Scientific data 5(1), 1–13 (2018) Cop et al. [2017] Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior research methods 49, 602–615 (2017) Berzak et al. [2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: Celer: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. 
[2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 
4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. 
[2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. 
[2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. 
[2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
[2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 
4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. 
[2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. 
[2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. 
[2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264. https://aclanthology.org/D16-1264
Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval, vol. 39. Cambridge University Press, Cambridge (2008)
arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. 
In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. 
[2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. 
[2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . 
Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. 
[2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. 
Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Malmaud, J., Levy, R., Berzak, Y.: Bridging information-seeking human gaze and machine reading comprehension. In: Proceedings of the 24th Conference on Computational Natural Language Learning, pp. 142–152. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.conll-1.11 . https://aclanthology.org/2020.conll-1.11 de Leeuw [2015] Leeuw, J.R.: jsPsych: A JavaScript library for creating behavioral experiments in a Web browser. Behavior Research Methods 47(1), 1–12 (2015) https://doi.org/10.3758/s13428-014-0458-y Papoutsaki et al. [2016] Papoutsaki, A., Sangkloy, P., Laskey, J., Daskalova, N., Huang, J., Hays, J.: Webgazer: Scalable webcam eye tracking using user interactions. 
In: Proceedings of the Twenty-Fifth International Joint Conference on Artificial Intelligence. IJCAI'16, pp. 3839–3845. AAAI Press (2016)
Rayner, K., Pollatsek, A., Ashby, J., Clifton Jr, C.: Psychology of Reading (2012)
Liversedge, S.P., Findlay, J.M.: Saccadic eye movements and cognition. Trends in Cognitive Sciences 4(1), 6–14 (2000)
Rayner, K.: Eye movements in reading and information processing: 20 years of research. Psychological Bulletin 124(3), 372 (1998)
Radach, R., Kennedy, A.: Theoretical perspectives on eye movements in reading: Past controversies, current issues, and an agenda for future research. European Journal of Cognitive Psychology 16(1-2), 3–26 (2004)
In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 
1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Papoutsaki, A., Sangkloy, P., Laskey, J., Daskalova, N., Huang, J., Hays, J.: Webgazer: Scalable webcam eye tracking using user interactions. In: Proceedings of the Twenty-Fifth International Joint Conference on Artificial Intelligence. IJCAI’16, pp. 3839–3845. AAAI Press, ??? (2016) Rayner et al. [2012] Rayner, K., Pollatsek, A., Ashby, J., Clifton Jr, C.: Psychology of reading (2012) Liversedge and Findlay [2000] Liversedge, S.P., Findlay, J.M.: Saccadic eye movements and cognition. Trends in cognitive sciences 4(1), 6–14 (2000) Rayner [1998] Rayner, K.: Eye movements in reading and information processing: 20 years of research. Psychological bulletin 124(3), 372 (1998) Radach and Kennedy [2004] Radach, R., Kennedy, A.: Theoretical perspectives on eye movements in reading: Past controversies, current issues, and an agenda for future research. European journal of cognitive psychology 16(1-2), 3–26 (2004) Winke [2013] Winke, P.M.: Eye-tracking technology for reading. The companion to language assessment 2, 1029–1046 (2013) Hollenstein et al. [2018] Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: Zuco, a simultaneous eeg and eye-tracking resource for natural sentence reading. Scientific data 5(1), 1–13 (2018) Cop et al. [2017] Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior research methods 49, 602–615 (2017) Berzak et al. 
[2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: Celer: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. 
Behavior Research Methods 55(1), 364–416 (2023)
[2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. 
[2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. 
Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . 
Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. 
[2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Winke, P.M.: Eye-tracking technology for reading. The companion to language assessment 2, 1029–1046 (2013) Hollenstein et al. [2018] Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: Zuco, a simultaneous eeg and eye-tracking resource for natural sentence reading. Scientific data 5(1), 1–13 (2018) Cop et al. [2017] Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. 
Behavior research methods 49, 602–615 (2017) Berzak et al. [2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: Celer: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. 
In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . 
https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. 
Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. 
1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. 
[2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . 
[2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. 
Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016
[2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. 
[2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: How speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010)
[2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. 
[2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. 
In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. 
[2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . 
https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. 
[2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. 
Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. 
[2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. 
[2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. 
Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. 
[2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. 
Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. 
  2. Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence classification with human attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030
Hollenstein, N., Torre, A., Langer, N., Zhang, C.: CogniVal: A framework for cognitive word embedding evaluation. In: Proceedings of the 23rd Conference on Computational Natural Language Learning (CoNLL), pp. 538–549. Association for Computational Linguistics, Hong Kong, China (2019). https://doi.org/10.18653/v1/K19-1050
Sood, E., Tannert, S., Frassinelli, D., Bulling, A., Vu, N.T.: Interpreting attention models with human visual attention in machine reading comprehension. In: Proceedings of the 24th Conference on Computational Natural Language Learning, pp. 12–25. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.conll-1.2
Brandl, S., Hollenstein, N.: Every word counts: A multilingual analysis of individual human alignment with model attention. In: Proceedings of the 2nd Conference of the Asia-Pacific Chapter of the Association for Computational Linguistics and the 12th International Joint Conference on Natural Language Processing (Volume 2: Short Papers), pp. 72–77. Association for Computational Linguistics, Online only (2022). https://aclanthology.org/2022.aacl-short.10
Søgaard, A.: Explainable Natural Language Processing. Morgan & Claypool Publishers (2021)
Xu, S., Jiang, H., Lau, F.C.M.: User-oriented document summarization through vision-based eye-tracking. In: Proceedings of the 14th International Conference on Intelligent User Interfaces (IUI '09), pp. 7–16. Association for Computing Machinery, New York, NY, USA (2009). https://doi.org/10.1145/1502650.1502656
Barrett, M., Bingel, J., Keller, F., Søgaard, A.: Weakly supervised part-of-speech tagging using eye-tracking data. In: Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers), pp. 579–584. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/P16-2094
Hollenstein, N., Zhang, C.: Entity recognition at first sight: Improving NER with eye movement information. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 1–10. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1001
Hollenstein, N., Beinborn, L.: Relative importance in sentence processing. In: Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 2: Short Papers), pp. 141–150. Association for Computational Linguistics, Online (2021). https://doi.org/10.18653/v1/2021.acl-short.19
McGuire, E., Tomuro, N.: Sentiment analysis with cognitive attention supervision. In: Canadian Conference on AI (2021)
Dong, S., Goldstein, J., Yang, G.H.: GazBy: Gaze-based BERT model to incorporate human attention in neural information retrieval. In: Proceedings of the 2022 ACM SIGIR International Conference on Theory of Information Retrieval, pp. 182–192 (2022). https://doi.org/10.1145/3539813.3545129
Eberle, O., Brandl, S., Pilot, J., Søgaard, A.: Do transformer models show similar attention patterns to task-specific human gaze? In: Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pp. 4295–4309. Association for Computational Linguistics, Dublin, Ireland (2022). https://doi.org/10.18653/v1/2022.acl-long.296
Hollenstein, N., Troendle, M., Zhang, C., Langer, N.: ZuCo 2.0: A dataset of physiological recordings during natural reading and annotation. In: Proceedings of the Twelfth Language Resources and Evaluation Conference, pp. 138–146. European Language Resources Association, Marseille, France (2020). https://aclanthology.org/2020.lrec-1.18
Malmaud, J., Levy, R., Berzak, Y.: Bridging information-seeking human gaze and machine reading comprehension. In: Proceedings of the 24th Conference on Computational Natural Language Learning, pp. 142–152. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.conll-1.11
de Leeuw, J.R.: jsPsych: A JavaScript library for creating behavioral experiments in a web browser. Behavior Research Methods 47(1), 1–12 (2015). https://doi.org/10.3758/s13428-014-0458-y
Papoutsaki, A., Sangkloy, P., Laskey, J., Daskalova, N., Huang, J., Hays, J.: WebGazer: Scalable webcam eye tracking using user interactions. In: Proceedings of the Twenty-Fifth International Joint Conference on Artificial Intelligence (IJCAI'16), pp. 3839–3845. AAAI Press (2016)
Rayner, K., Pollatsek, A., Ashby, J., Clifton Jr, C.: Psychology of Reading (2012)
Liversedge, S.P., Findlay, J.M.: Saccadic eye movements and cognition. Trends in Cognitive Sciences 4(1), 6–14 (2000)
Rayner, K.: Eye movements in reading and information processing: 20 years of research. Psychological Bulletin 124(3), 372 (1998)
Radach, R., Kennedy, A.: Theoretical perspectives on eye movements in reading: Past controversies, current issues, and an agenda for future research. European Journal of Cognitive Psychology 16(1–2), 3–26 (2004)
Winke, P.M.: Eye-tracking technology for reading. The Companion to Language Assessment 2, 1029–1046 (2013)
Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: ZuCo, a simultaneous EEG and eye-tracking resource for natural sentence reading. Scientific Data 5(1), 1–13 (2018)
Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior Research Methods 49, 602–615 (2017)
Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: CELER: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022)
Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior Research Methods, 1–21 (2022)
Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: Activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016)
Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in English as a second language: Evidence from the Multilingual Eye-movements Corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023)
Hollenstein, N., Barrett, M., Björnsdóttir, M.: The Copenhagen Corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182
Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior Research Methods, 1–7 (2023)
Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: Empirical foundations for a minimal reporting guideline. Behavior Research Methods 55(1), 364–416 (2023)
Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019)
Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer (2023)
Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence, pp. 4907–4913 (2021)
Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018). https://doi.org/10.3758/s13428-017-0908-4
González-Garduño, A.V., Søgaard, A.: Using gaze to predict text readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050
Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016
Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2
Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012)
Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: How speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010)
Ferhat, O., Vilariño, F.: Low cost eye tracking: The current panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016). https://doi.org/10.1155/2016/8680541
Papoutsaki, A.: Scalable webcam eye tracking by learning from user interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems (CHI EA '15), pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627
Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013). https://doi.org/10.1109/TPAMI.2012.101
Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014). https://doi.org/10.1109/TPAMI.2014.2313123
Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: TurkerGaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015)
Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam eye tracking for remote studies of web search. In: Proceedings of the 2017 Conference on Human Information Interaction and Retrieval (CHIIR '17), pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170
Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-based personalized summarization of Wikipedia reading session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext (HUMAN'20). Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662
Valliappan et al.
[2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. 
Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. 
[2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Hollenstein, N., Torre, A., Langer, N., Zhang, C.: CogniVal: A framework for cognitive word embedding evaluation. In: Proceedings of the 23rd Conference on Computational Natural Language Learning (CoNLL), pp. 538–549. Association for Computational Linguistics, Hong Kong, China (2019). https://doi.org/10.18653/v1/K19-1050 . https://aclanthology.org/K19-1050 Sood et al. 
[2020] Sood, E., Tannert, S., Frassinelli, D., Bulling, A., Vu, N.T.: Interpreting attention models with human visual attention in machine reading comprehension. In: Proceedings of the 24th Conference on Computational Natural Language Learning, pp. 12–25. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.conll-1.2 . https://aclanthology.org/2020.conll-1.2 Brandl and Hollenstein [2022] Brandl, S., Hollenstein, N.: Every word counts: A multilingual analysis of individual human alignment with model attention. In: Proceedings of the 2nd Conference of the Asia-Pacific Chapter of the Association for Computational Linguistics and the 12th International Joint Conference on Natural Language Processing (Volume 2: Short Papers), pp. 72–77. Association for Computational Linguistics, Online only (2022). https://aclanthology.org/2022.aacl-short.10 Søgaard [2021] Søgaard, A.: Explainable Natural Language Processing. Morgan & Claypool Publishers, ??? (2021) Xu et al. [2009] Xu, S., Jiang, H., Lau, F.C.M.: User-Oriented Document Summarization through Vision-Based Eye-Tracking. In: Proceedings of the 14th International Conference on Intelligent User Interfaces. IUI ’09, pp. 7–16. Association for Computing Machinery, New York, NY, USA (2009). https://doi.org/10.1145/1502650.1502656 . https://doi.org/10.1145/1502650.1502656 Barrett et al. [2016] Barrett, M., Bingel, J., Keller, F., Søgaard, A.: Weakly Supervised Part-of-speech Tagging Using Eye-tracking Data. In: Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers), pp. 579–584. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/P16-2094 . https://aclanthology.org/P16-2094 Hollenstein and Zhang [2019] Hollenstein, N., Zhang, C.: Entity recognition at first sight: Improving NER with eye movement information. 
In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 1–10. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1001 . https://aclanthology.org/N19-1001 Hollenstein and Beinborn [2021] Hollenstein, N., Beinborn, L.: Relative importance in sentence processing. In: Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 2: Short Papers), pp. 141–150. Association for Computational Linguistics, Online (2021). https://doi.org/10.18653/v1/2021.acl-short.19 . https://aclanthology.org/2021.acl-short.19 McGuire and Tomuro [2021] McGuire, E., Tomuro, N.: Sentiment analysis with cognitive attention supervision. In: Canadian Conference on AI (2021) Dong et al. [2022] Dong, S., Goldstein, J., Yang, G.H.: GazBy: Gaze-Based BERT Model to Incorporate Human Attention in Neural Information Retrieval. In: Proceedings of the 2022 ACM SIGIR International Conference on Theory of Information Retrieval, pp. 182–192 (2022). https://doi.org/10.1145/3539813.3545129 Eberle et al. [2022] Eberle, O., Brandl, S., Pilot, J., Søgaard, A.: Do transformer models show similar attention patterns to task-specific human gaze? In: Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pp. 4295–4309. Association for Computational Linguistics, Dublin, Ireland (2022). https://doi.org/10.18653/v1/2022.acl-long.296 . https://aclanthology.org/2022.acl-long.296 Hollenstein et al. [2020] Hollenstein, N., Troendle, M., Zhang, C., Langer, N.: ZuCo 2.0: A dataset of physiological recordings during natural reading and annotation. In: Proceedings of the Twelfth Language Resources and Evaluation Conference, pp. 138–146. 
European Language Resources Association, Marseille, France (2020). https://aclanthology.org/2020.lrec-1.18 Malmaud et al. [2020] Malmaud, J., Levy, R., Berzak, Y.: Bridging information-seeking human gaze and machine reading comprehension. In: Proceedings of the 24th Conference on Computational Natural Language Learning, pp. 142–152. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.conll-1.11 . https://aclanthology.org/2020.conll-1.11 de Leeuw [2015] Leeuw, J.R.: jsPsych: A JavaScript library for creating behavioral experiments in a Web browser. Behavior Research Methods 47(1), 1–12 (2015) https://doi.org/10.3758/s13428-014-0458-y Papoutsaki et al. [2016] Papoutsaki, A., Sangkloy, P., Laskey, J., Daskalova, N., Huang, J., Hays, J.: Webgazer: Scalable webcam eye tracking using user interactions. In: Proceedings of the Twenty-Fifth International Joint Conference on Artificial Intelligence. IJCAI’16, pp. 3839–3845. AAAI Press, ??? (2016) Rayner et al. [2012] Rayner, K., Pollatsek, A., Ashby, J., Clifton Jr, C.: Psychology of reading (2012) Liversedge and Findlay [2000] Liversedge, S.P., Findlay, J.M.: Saccadic eye movements and cognition. Trends in cognitive sciences 4(1), 6–14 (2000) Rayner [1998] Rayner, K.: Eye movements in reading and information processing: 20 years of research. Psychological bulletin 124(3), 372 (1998) Radach and Kennedy [2004] Radach, R., Kennedy, A.: Theoretical perspectives on eye movements in reading: Past controversies, current issues, and an agenda for future research. European journal of cognitive psychology 16(1-2), 3–26 (2004) Winke [2013] Winke, P.M.: Eye-tracking technology for reading. The companion to language assessment 2, 1029–1046 (2013) Hollenstein et al. [2018] Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: Zuco, a simultaneous eeg and eye-tracking resource for natural sentence reading. Scientific data 5(1), 1–13 (2018) Cop et al. 
[2017] Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior research methods 49, 602–615 (2017) Berzak et al. [2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: Celer: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. 
[2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. 
[2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. 
[2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. 
[2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 
71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. 
Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Sood, E., Tannert, S., Frassinelli, D., Bulling, A., Vu, N.T.: Interpreting attention models with human visual attention in machine reading comprehension. In: Proceedings of the 24th Conference on Computational Natural Language Learning, pp. 12–25. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.conll-1.2 . https://aclanthology.org/2020.conll-1.2 Brandl and Hollenstein [2022] Brandl, S., Hollenstein, N.: Every word counts: A multilingual analysis of individual human alignment with model attention. In: Proceedings of the 2nd Conference of the Asia-Pacific Chapter of the Association for Computational Linguistics and the 12th International Joint Conference on Natural Language Processing (Volume 2: Short Papers), pp. 72–77. Association for Computational Linguistics, Online only (2022). https://aclanthology.org/2022.aacl-short.10 Søgaard [2021] Søgaard, A.: Explainable Natural Language Processing. Morgan & Claypool Publishers, ??? (2021) Xu et al. [2009] Xu, S., Jiang, H., Lau, F.C.M.: User-Oriented Document Summarization through Vision-Based Eye-Tracking. 
In: Proceedings of the 14th International Conference on Intelligent User Interfaces. IUI ’09, pp. 7–16. Association for Computing Machinery, New York, NY, USA (2009). https://doi.org/10.1145/1502650.1502656 . https://doi.org/10.1145/1502650.1502656 Barrett et al. [2016] Barrett, M., Bingel, J., Keller, F., Søgaard, A.: Weakly Supervised Part-of-speech Tagging Using Eye-tracking Data. In: Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers), pp. 579–584. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/P16-2094 . https://aclanthology.org/P16-2094 Hollenstein and Zhang [2019] Hollenstein, N., Zhang, C.: Entity recognition at first sight: Improving NER with eye movement information. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 1–10. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1001 . https://aclanthology.org/N19-1001 Hollenstein and Beinborn [2021] Hollenstein, N., Beinborn, L.: Relative importance in sentence processing. In: Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 2: Short Papers), pp. 141–150. Association for Computational Linguistics, Online (2021). https://doi.org/10.18653/v1/2021.acl-short.19 . https://aclanthology.org/2021.acl-short.19 McGuire and Tomuro [2021] McGuire, E., Tomuro, N.: Sentiment analysis with cognitive attention supervision. In: Canadian Conference on AI (2021) Dong et al. [2022] Dong, S., Goldstein, J., Yang, G.H.: GazBy: Gaze-Based BERT Model to Incorporate Human Attention in Neural Information Retrieval. In: Proceedings of the 2022 ACM SIGIR International Conference on Theory of Information Retrieval, pp. 
182–192 (2022). https://doi.org/10.1145/3539813.3545129 Eberle et al. [2022] Eberle, O., Brandl, S., Pilot, J., Søgaard, A.: Do transformer models show similar attention patterns to task-specific human gaze? In: Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pp. 4295–4309. Association for Computational Linguistics, Dublin, Ireland (2022). https://doi.org/10.18653/v1/2022.acl-long.296 . https://aclanthology.org/2022.acl-long.296 Hollenstein et al. [2020] Hollenstein, N., Troendle, M., Zhang, C., Langer, N.: ZuCo 2.0: A dataset of physiological recordings during natural reading and annotation. In: Proceedings of the Twelfth Language Resources and Evaluation Conference, pp. 138–146. European Language Resources Association, Marseille, France (2020). https://aclanthology.org/2020.lrec-1.18 Malmaud et al. [2020] Malmaud, J., Levy, R., Berzak, Y.: Bridging information-seeking human gaze and machine reading comprehension. In: Proceedings of the 24th Conference on Computational Natural Language Learning, pp. 142–152. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.conll-1.11 . https://aclanthology.org/2020.conll-1.11 de Leeuw [2015] Leeuw, J.R.: jsPsych: A JavaScript library for creating behavioral experiments in a Web browser. Behavior Research Methods 47(1), 1–12 (2015) https://doi.org/10.3758/s13428-014-0458-y Papoutsaki et al. [2016] Papoutsaki, A., Sangkloy, P., Laskey, J., Daskalova, N., Huang, J., Hays, J.: Webgazer: Scalable webcam eye tracking using user interactions. In: Proceedings of the Twenty-Fifth International Joint Conference on Artificial Intelligence. IJCAI’16, pp. 3839–3845. AAAI Press, ??? (2016) Rayner et al. [2012] Rayner, K., Pollatsek, A., Ashby, J., Clifton Jr, C.: Psychology of reading (2012) Liversedge and Findlay [2000] Liversedge, S.P., Findlay, J.M.: Saccadic eye movements and cognition. 
Trends in cognitive sciences 4(1), 6–14 (2000) Rayner [1998] Rayner, K.: Eye movements in reading and information processing: 20 years of research. Psychological bulletin 124(3), 372 (1998) Radach and Kennedy [2004] Radach, R., Kennedy, A.: Theoretical perspectives on eye movements in reading: Past controversies, current issues, and an agenda for future research. European journal of cognitive psychology 16(1-2), 3–26 (2004) Winke [2013] Winke, P.M.: Eye-tracking technology for reading. The companion to language assessment 2, 1029–1046 (2013) Hollenstein et al. [2018] Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: ZuCo, a simultaneous EEG and eye-tracking resource for natural sentence reading. Scientific data 5(1), 1–13 (2018) Cop et al. [2017] Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior research methods 49, 602–615 (2017) Berzak et al. [2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: Celer: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in English as a second language: Evidence from the Multilingual Eye-Movements Corpus.
Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The Copenhagen Corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability.
In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . 
Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Human Information Interaction and Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 Valliappan et al.
[2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera: Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39.
Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. 
[2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Brandl, S., Hollenstein, N.: Every word counts: A multilingual analysis of individual human alignment with model attention. In: Proceedings of the 2nd Conference of the Asia-Pacific Chapter of the Association for Computational Linguistics and the 12th International Joint Conference on Natural Language Processing (Volume 2: Short Papers), pp. 72–77. Association for Computational Linguistics, Online only (2022). 
https://aclanthology.org/2022.aacl-short.10 Søgaard [2021] Søgaard, A.: Explainable Natural Language Processing. Morgan & Claypool Publishers, ??? (2021)
Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. 
[2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. 
[2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. 
Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . 
https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021)
Trends in cognitive sciences 4(1), 6–14 (2000) Rayner [1998] Rayner, K.: Eye movements in reading and information processing: 20 years of research. Psychological bulletin 124(3), 372 (1998) Radach and Kennedy [2004] Radach, R., Kennedy, A.: Theoretical perspectives on eye movements in reading: Past controversies, current issues, and an agenda for future research. European journal of cognitive psychology 16(1-2), 3–26 (2004) Winke [2013] Winke, P.M.: Eye-tracking technology for reading. The companion to language assessment 2, 1029–1046 (2013) Hollenstein et al. [2018] Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: Zuco, a simultaneous eeg and eye-tracking resource for natural sentence reading. Scientific data 5(1), 1–13 (2018) Cop et al. [2017] Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior research methods 49, 602–615 (2017) Berzak et al. [2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: Celer: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. 
Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. 
In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . 
Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. 
[2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature Communications 11(1), 4553 (2020)
Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022). https://doi.org/10.1016/j.bspc.2022.103521
Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera: Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022)
Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023)
[2018] Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: Zuco, a simultaneous eeg and eye-tracking resource for natural sentence reading. Scientific data 5(1), 1–13 (2018) Cop et al. [2017] Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior research methods 49, 602–615 (2017) Berzak et al. [2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: Celer: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. 
[2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. 
In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . 
https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. 
Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. 
[2018] Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: Zuco, a simultaneous eeg and eye-tracking resource for natural sentence reading. Scientific data 5(1), 1–13 (2018) Cop et al. [2017] Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior research methods 49, 602–615 (2017) Berzak et al. [2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: Celer: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. 
[2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. 
In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . 
https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. 
Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. 
Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8
In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . 
https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. 
[2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021)
Chiang, C.-H., Lee, H.-Y.: Re-examining human annotations for interpretable NLP (2022)
Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023)
Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022)
Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Malmaud, J., Levy, R., Berzak, Y.: Bridging information-seeking human gaze and machine reading comprehension. In: Proceedings of the 24th Conference on Computational Natural Language Learning, pp. 142–152. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.conll-1.11 . https://aclanthology.org/2020.conll-1.11 de Leeuw [2015] Leeuw, J.R.: jsPsych: A JavaScript library for creating behavioral experiments in a Web browser. Behavior Research Methods 47(1), 1–12 (2015) https://doi.org/10.3758/s13428-014-0458-y Papoutsaki et al. [2016] Papoutsaki, A., Sangkloy, P., Laskey, J., Daskalova, N., Huang, J., Hays, J.: Webgazer: Scalable webcam eye tracking using user interactions. In: Proceedings of the Twenty-Fifth International Joint Conference on Artificial Intelligence. IJCAI’16, pp. 3839–3845. AAAI Press, ??? (2016) Rayner et al. [2012] Rayner, K., Pollatsek, A., Ashby, J., Clifton Jr, C.: Psychology of reading (2012) Liversedge and Findlay [2000] Liversedge, S.P., Findlay, J.M.: Saccadic eye movements and cognition. Trends in cognitive sciences 4(1), 6–14 (2000) Rayner [1998] Rayner, K.: Eye movements in reading and information processing: 20 years of research. Psychological bulletin 124(3), 372 (1998) Radach and Kennedy [2004] Radach, R., Kennedy, A.: Theoretical perspectives on eye movements in reading: Past controversies, current issues, and an agenda for future research. European journal of cognitive psychology 16(1-2), 3–26 (2004) Winke [2013] Winke, P.M.: Eye-tracking technology for reading. The companion to language assessment 2, 1029–1046 (2013) Hollenstein et al. 
[2018] Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: Zuco, a simultaneous eeg and eye-tracking resource for natural sentence reading. Scientific data 5(1), 1–13 (2018) Cop et al. [2017] Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior research methods 49, 602–615 (2017) Berzak et al. [2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: Celer: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. 
[2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. 
In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . 
https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. 
Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. 
Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Leeuw, J.R.: jsPsych: A JavaScript library for creating behavioral experiments in a Web browser. Behavior Research Methods 47(1), 1–12 (2015) https://doi.org/10.3758/s13428-014-0458-y Papoutsaki et al. [2016] Papoutsaki, A., Sangkloy, P., Laskey, J., Daskalova, N., Huang, J., Hays, J.: Webgazer: Scalable webcam eye tracking using user interactions. In: Proceedings of the Twenty-Fifth International Joint Conference on Artificial Intelligence. IJCAI’16, pp. 3839–3845. AAAI Press, ??? (2016) Rayner et al. [2012] Rayner, K., Pollatsek, A., Ashby, J., Clifton Jr, C.: Psychology of reading (2012) Liversedge and Findlay [2000] Liversedge, S.P., Findlay, J.M.: Saccadic eye movements and cognition. 
Trends in cognitive sciences 4(1), 6–14 (2000) Rayner [1998] Rayner, K.: Eye movements in reading and information processing: 20 years of research. Psychological bulletin 124(3), 372 (1998) Radach and Kennedy [2004] Radach, R., Kennedy, A.: Theoretical perspectives on eye movements in reading: Past controversies, current issues, and an agenda for future research. European journal of cognitive psychology 16(1-2), 3–26 (2004) Winke [2013] Winke, P.M.: Eye-tracking technology for reading. The companion to language assessment 2, 1029–1046 (2013) Hollenstein et al. [2018] Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: Zuco, a simultaneous eeg and eye-tracking resource for natural sentence reading. Scientific data 5(1), 1–13 (2018) Cop et al. [2017] Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior research methods 49, 602–615 (2017) Berzak et al. [2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: Celer: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. 
Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. 
In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . 
Publisher: Hindawi Publishing Corporation
Papoutsaki [2015] Papoutsaki, A.: Scalable webcam eye tracking by learning from user interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627
Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013). https://doi.org/10.1109/TPAMI.2012.101
Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014). https://doi.org/10.1109/TPAMI.2014.2313123
Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: TurkerGaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015)
Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam eye tracking for remote studies of web search. In: Proceedings of the 2017 Conference on Human Information Interaction and Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170
Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-based personalized summarization of Wikipedia reading session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN ’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662
Valliappan et al.
[2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature Communications 11(1), 4553 (2020)
Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022). https://doi.org/10.1016/j.bspc.2022.103521
Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera: Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022)
Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023)
Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421
Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264
Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval, vol. 39.
Cambridge University Press, Cambridge (2008)
Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016). https://doi.org/10.3758/s13428-015-0642-8
Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000)
Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: From eye fixations to comprehension. Psychological Review 87(4), 329 (1980)
DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408
Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021)
Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-examining human annotations for interpretable NLP (2022)
Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023)
Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022)
Nuraini et al.
[2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155. IEEE (2021)
Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423
Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022)
Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023)
Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
Papoutsaki et al. [2016] Papoutsaki, A., Sangkloy, P., Laskey, J., Daskalova, N., Huang, J., Hays, J.: WebGazer: Scalable webcam eye tracking using user interactions. In: Proceedings of the Twenty-Fifth International Joint Conference on Artificial Intelligence. IJCAI ’16, pp. 3839–3845. AAAI Press (2016)
Rayner et al.
[2012] Rayner, K., Pollatsek, A., Ashby, J., Clifton Jr., C.: Psychology of Reading (2012)
Liversedge and Findlay [2000] Liversedge, S.P., Findlay, J.M.: Saccadic eye movements and cognition. Trends in Cognitive Sciences 4(1), 6–14 (2000)
Rayner [1998] Rayner, K.: Eye movements in reading and information processing: 20 years of research. Psychological Bulletin 124(3), 372 (1998)
Radach and Kennedy [2004] Radach, R., Kennedy, A.: Theoretical perspectives on eye movements in reading: Past controversies, current issues, and an agenda for future research. European Journal of Cognitive Psychology 16(1–2), 3–26 (2004)
Winke [2013] Winke, P.M.: Eye-tracking technology for reading. The Companion to Language Assessment 2, 1029–1046 (2013)
Hollenstein et al. [2018] Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: ZuCo, a simultaneous EEG and eye-tracking resource for natural sentence reading. Scientific Data 5(1), 1–13 (2018)
Cop et al. [2017] Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior Research Methods 49, 602–615 (2017)
Berzak et al. [2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: CELER: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022)
Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior Research Methods, 1–21 (2022). Publisher: Springer
Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: Activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016)
Kuperman et al.
[2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in English as a second language: Evidence from the Multilingual Eye-movements Corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023)
[2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 
4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. 
[2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. 
[2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. 
Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. 
[2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Winke, P.M.: Eye-tracking technology for reading. The companion to language assessment 2, 1029–1046 (2013) Hollenstein et al. [2018] Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: Zuco, a simultaneous eeg and eye-tracking resource for natural sentence reading. Scientific data 5(1), 1–13 (2018) Cop et al. [2017] Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. 
[2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. 
In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . 
https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. 
In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 
1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. 
[2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021)
Chiang, C.-H., Lee, H.-Y.: Re-examining human annotations for interpretable NLP (2022)
Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023)
Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022)
Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155. IEEE (2021)
Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423
Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022)
[2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. 
[2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. 
Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. 
[2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. 
Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. 
Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. 
Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. 
In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. 
Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. 
[2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. 
In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. 
[2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. 
Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627
Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013). https://doi.org/10.1109/TPAMI.2012.101
Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014). https://doi.org/10.1109/TPAMI.2014.2313123
Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: TurkerGaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015)
Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam eye tracking for remote studies of web search. In: Proceedings of the 2017 Conference on Human Information Interaction and Retrieval. CHIIR '17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170
Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-based personalized summarization of Wikipedia reading session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN '20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662
Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature Communications 11(1), 4553 (2020)
Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022). https://doi.org/10.1016/j.bspc.2022.103521
Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera: Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022)
Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023)
Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421
Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264
Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval, vol. 39. Cambridge University Press, Cambridge (2008)
Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016). https://doi.org/10.3758/s13428-015-0642-8
Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000)
Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: From eye fixations to comprehension. Psychological Review 87(4), 329 (1980)
DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408
Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021)
Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-examining human annotations for interpretable NLP (2022)
Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023)
Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022)
Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155. IEEE (2021)
Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423
Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022)
Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023)
Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using gaze to predict text readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050
Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016
Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence classification with human attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030
Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2
Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012)
Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: How speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010)
Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low cost eye tracking: The current panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016). https://doi.org/10.1155/2016/8680541
45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. 
[2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. 
[2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. 
In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 
1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. 
Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. 
[2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. 
In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. 
[2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. 
[2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. 
[2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. 
In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 
1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. 
[2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . 
https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. 
[2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. 
In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. 
[2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. 
[2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. 
In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 
45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 
45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. 
Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). 
IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . 
https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
Hollenstein, N., Torre, A., Langer, N., Zhang, C.: CogniVal: A framework for cognitive word embedding evaluation. In: Proceedings of the 23rd Conference on Computational Natural Language Learning (CoNLL), pp. 538–549. Association for Computational Linguistics, Hong Kong, China (2019). https://doi.org/10.18653/v1/K19-1050. https://aclanthology.org/K19-1050
Sood, E., Tannert, S., Frassinelli, D., Bulling, A., Vu, N.T.: Interpreting attention models with human visual attention in machine reading comprehension. In: Proceedings of the 24th Conference on Computational Natural Language Learning, pp. 12–25. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.conll-1.2. https://aclanthology.org/2020.conll-1.2
Brandl, S., Hollenstein, N.: Every word counts: A multilingual analysis of individual human alignment with model attention. In: Proceedings of the 2nd Conference of the Asia-Pacific Chapter of the Association for Computational Linguistics and the 12th International Joint Conference on Natural Language Processing (Volume 2: Short Papers), pp. 72–77. Association for Computational Linguistics, Online only (2022). https://aclanthology.org/2022.aacl-short.10
Søgaard, A.: Explainable Natural Language Processing. Morgan & Claypool Publishers (2021)
Xu, S., Jiang, H., Lau, F.C.M.: User-oriented document summarization through vision-based eye-tracking. In: Proceedings of the 14th International Conference on Intelligent User Interfaces. IUI '09, pp. 7–16. Association for Computing Machinery, New York, NY, USA (2009). https://doi.org/10.1145/1502650.1502656
Barrett, M., Bingel, J., Keller, F., Søgaard, A.: Weakly supervised part-of-speech tagging using eye-tracking data. In: Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers), pp. 579–584. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/P16-2094. https://aclanthology.org/P16-2094
Hollenstein, N., Zhang, C.: Entity recognition at first sight: Improving NER with eye movement information. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 1–10. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1001. https://aclanthology.org/N19-1001
Hollenstein, N., Beinborn, L.: Relative importance in sentence processing. In: Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 2: Short Papers), pp. 141–150. Association for Computational Linguistics, Online (2021). https://doi.org/10.18653/v1/2021.acl-short.19. https://aclanthology.org/2021.acl-short.19
McGuire, E., Tomuro, N.: Sentiment analysis with cognitive attention supervision. In: Canadian Conference on AI (2021)
Dong, S., Goldstein, J., Yang, G.H.: GazBy: Gaze-based BERT model to incorporate human attention in neural information retrieval. In: Proceedings of the 2022 ACM SIGIR International Conference on Theory of Information Retrieval, pp. 182–192 (2022). https://doi.org/10.1145/3539813.3545129
Eberle, O., Brandl, S., Pilot, J., Søgaard, A.: Do transformer models show similar attention patterns to task-specific human gaze? In: Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pp. 4295–4309. Association for Computational Linguistics, Dublin, Ireland (2022). https://doi.org/10.18653/v1/2022.acl-long.296. https://aclanthology.org/2022.acl-long.296
Hollenstein, N., Troendle, M., Zhang, C., Langer, N.: ZuCo 2.0: A dataset of physiological recordings during natural reading and annotation. In: Proceedings of the Twelfth Language Resources and Evaluation Conference, pp. 138–146. European Language Resources Association, Marseille, France (2020). https://aclanthology.org/2020.lrec-1.18
Malmaud, J., Levy, R., Berzak, Y.: Bridging information-seeking human gaze and machine reading comprehension. In: Proceedings of the 24th Conference on Computational Natural Language Learning, pp. 142–152. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.conll-1.11. https://aclanthology.org/2020.conll-1.11
de Leeuw, J.R.: jsPsych: A JavaScript library for creating behavioral experiments in a Web browser. Behavior Research Methods 47(1), 1–12 (2015). https://doi.org/10.3758/s13428-014-0458-y
Papoutsaki, A., Sangkloy, P., Laskey, J., Daskalova, N., Huang, J., Hays, J.: WebGazer: Scalable webcam eye tracking using user interactions. In: Proceedings of the Twenty-Fifth International Joint Conference on Artificial Intelligence. IJCAI'16, pp. 3839–3845. AAAI Press (2016)
Rayner, K., Pollatsek, A., Ashby, J., Clifton Jr, C.: Psychology of Reading (2012)
Liversedge, S.P., Findlay, J.M.: Saccadic eye movements and cognition. Trends in Cognitive Sciences 4(1), 6–14 (2000)
Rayner, K.: Eye movements in reading and information processing: 20 years of research. Psychological Bulletin 124(3), 372 (1998)
Radach, R., Kennedy, A.: Theoretical perspectives on eye movements in reading: Past controversies, current issues, and an agenda for future research. European Journal of Cognitive Psychology 16(1-2), 3–26 (2004)
Winke, P.M.: Eye-tracking technology for reading. The Companion to Language Assessment 2, 1029–1046 (2013)
Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: ZuCo, a simultaneous EEG and eye-tracking resource for natural sentence reading. Scientific Data 5(1), 1–13 (2018)
Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior Research Methods 49, 602–615 (2017)
Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: CELER: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022)
Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior Research Methods, 1–21 (2022). Springer
Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: Activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016)
Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in English as a second language: Evidence from the Multilingual Eye-movements Corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023)
Hollenstein, N., Barrett, M., Björnsdóttir, M.: The Copenhagen Corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182
Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior Research Methods, 1–7 (2023)
Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: Empirical foundations for a minimal reporting guideline. Behavior Research Methods 55(1), 364–416 (2023)
Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019)
Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer (2023)
Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence, pp. 4907–4913 (2021)
Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018). https://doi.org/10.3758/s13428-017-0908-4
González-Garduño, A.V., Søgaard, A.: Using gaze to predict text readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050. https://aclanthology.org/W17-5050
Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016. https://aclanthology.org/K16-1016
Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence classification with human attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030. https://aclanthology.org/K18-1030
Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2
Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012)
Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: How speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010)
Ferhat, O., Vilariño, F.: Low cost eye tracking: The current panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016). https://doi.org/10.1155/2016/8680541. Hindawi Publishing Corporation
Papoutsaki, A.: Scalable webcam eye tracking by learning from user interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA '15, pp. 219–222.
Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. 
[2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. 
Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Sood, E., Tannert, S., Frassinelli, D., Bulling, A., Vu, N.T.: Interpreting attention models with human visual attention in machine reading comprehension. In: Proceedings of the 24th Conference on Computational Natural Language Learning, pp. 12–25. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.conll-1.2 . https://aclanthology.org/2020.conll-1.2 Brandl and Hollenstein [2022] Brandl, S., Hollenstein, N.: Every word counts: A multilingual analysis of individual human alignment with model attention. In: Proceedings of the 2nd Conference of the Asia-Pacific Chapter of the Association for Computational Linguistics and the 12th International Joint Conference on Natural Language Processing (Volume 2: Short Papers), pp. 72–77. 
Association for Computational Linguistics, Online only (2022). https://aclanthology.org/2022.aacl-short.10 Søgaard [2021] Søgaard, A.: Explainable Natural Language Processing. Morgan & Claypool Publishers, ??? (2021) Xu et al. [2009] Xu, S., Jiang, H., Lau, F.C.M.: User-Oriented Document Summarization through Vision-Based Eye-Tracking. In: Proceedings of the 14th International Conference on Intelligent User Interfaces. IUI ’09, pp. 7–16. Association for Computing Machinery, New York, NY, USA (2009). https://doi.org/10.1145/1502650.1502656 . https://doi.org/10.1145/1502650.1502656 Barrett et al. [2016] Barrett, M., Bingel, J., Keller, F., Søgaard, A.: Weakly Supervised Part-of-speech Tagging Using Eye-tracking Data. In: Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers), pp. 579–584. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/P16-2094 . https://aclanthology.org/P16-2094 Hollenstein and Zhang [2019] Hollenstein, N., Zhang, C.: Entity recognition at first sight: Improving NER with eye movement information. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 1–10. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1001 . https://aclanthology.org/N19-1001 Hollenstein and Beinborn [2021] Hollenstein, N., Beinborn, L.: Relative importance in sentence processing. In: Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 2: Short Papers), pp. 141–150. Association for Computational Linguistics, Online (2021). https://doi.org/10.18653/v1/2021.acl-short.19 . 
https://aclanthology.org/2021.acl-short.19 McGuire and Tomuro [2021] McGuire, E., Tomuro, N.: Sentiment analysis with cognitive attention supervision. In: Canadian Conference on AI (2021) Dong et al. [2022] Dong, S., Goldstein, J., Yang, G.H.: GazBy: Gaze-Based BERT Model to Incorporate Human Attention in Neural Information Retrieval. In: Proceedings of the 2022 ACM SIGIR International Conference on Theory of Information Retrieval, pp. 182–192 (2022). https://doi.org/10.1145/3539813.3545129 Eberle et al. [2022] Eberle, O., Brandl, S., Pilot, J., Søgaard, A.: Do transformer models show similar attention patterns to task-specific human gaze? In: Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pp. 4295–4309. Association for Computational Linguistics, Dublin, Ireland (2022). https://doi.org/10.18653/v1/2022.acl-long.296 . https://aclanthology.org/2022.acl-long.296 Hollenstein et al. [2020] Hollenstein, N., Troendle, M., Zhang, C., Langer, N.: ZuCo 2.0: A dataset of physiological recordings during natural reading and annotation. In: Proceedings of the Twelfth Language Resources and Evaluation Conference, pp. 138–146. European Language Resources Association, Marseille, France (2020). https://aclanthology.org/2020.lrec-1.18 Malmaud et al. [2020] Malmaud, J., Levy, R., Berzak, Y.: Bridging information-seeking human gaze and machine reading comprehension. In: Proceedings of the 24th Conference on Computational Natural Language Learning, pp. 142–152. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.conll-1.11 . https://aclanthology.org/2020.conll-1.11 de Leeuw [2015] Leeuw, J.R.: jsPsych: A JavaScript library for creating behavioral experiments in a Web browser. Behavior Research Methods 47(1), 1–12 (2015) https://doi.org/10.3758/s13428-014-0458-y Papoutsaki et al. 
[2016] Papoutsaki, A., Sangkloy, P., Laskey, J., Daskalova, N., Huang, J., Hays, J.: Webgazer: Scalable webcam eye tracking using user interactions. In: Proceedings of the Twenty-Fifth International Joint Conference on Artificial Intelligence. IJCAI’16, pp. 3839–3845. AAAI Press, ??? (2016) Rayner et al. [2012] Rayner, K., Pollatsek, A., Ashby, J., Clifton Jr, C.: Psychology of reading (2012) Liversedge and Findlay [2000] Liversedge, S.P., Findlay, J.M.: Saccadic eye movements and cognition. Trends in cognitive sciences 4(1), 6–14 (2000) Rayner [1998] Rayner, K.: Eye movements in reading and information processing: 20 years of research. Psychological bulletin 124(3), 372 (1998) Radach and Kennedy [2004] Radach, R., Kennedy, A.: Theoretical perspectives on eye movements in reading: Past controversies, current issues, and an agenda for future research. European journal of cognitive psychology 16(1-2), 3–26 (2004) Winke [2013] Winke, P.M.: Eye-tracking technology for reading. The companion to language assessment 2, 1029–1046 (2013) Hollenstein et al. [2018] Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: Zuco, a simultaneous eeg and eye-tracking resource for natural sentence reading. Scientific data 5(1), 1–13 (2018) Cop et al. [2017] Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior research methods 49, 602–615 (2017) Berzak et al. [2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: Celer: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). 
Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. 
[2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. 
[2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. 
Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . 
https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Brandl, S., Hollenstein, N.: Every word counts: A multilingual analysis of individual human alignment with model attention. In: Proceedings of the 2nd Conference of the Asia-Pacific Chapter of the Association for Computational Linguistics and the 12th International Joint Conference on Natural Language Processing (Volume 2: Short Papers), pp. 72–77. Association for Computational Linguistics, Online only (2022). https://aclanthology.org/2022.aacl-short.10 Søgaard [2021] Søgaard, A.: Explainable Natural Language Processing. Morgan & Claypool Publishers, ??? (2021) Xu et al. [2009] Xu, S., Jiang, H., Lau, F.C.M.: User-Oriented Document Summarization through Vision-Based Eye-Tracking. In: Proceedings of the 14th International Conference on Intelligent User Interfaces. IUI ’09, pp. 7–16. Association for Computing Machinery, New York, NY, USA (2009). https://doi.org/10.1145/1502650.1502656 . https://doi.org/10.1145/1502650.1502656 Barrett et al. [2016] Barrett, M., Bingel, J., Keller, F., Søgaard, A.: Weakly Supervised Part-of-speech Tagging Using Eye-tracking Data. In: Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers), pp. 579–584. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/P16-2094 . https://aclanthology.org/P16-2094 Hollenstein and Zhang [2019] Hollenstein, N., Zhang, C.: Entity recognition at first sight: Improving NER with eye movement information. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 1–10. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1001 . 
https://aclanthology.org/N19-1001 Hollenstein and Beinborn [2021] Hollenstein, N., Beinborn, L.: Relative importance in sentence processing. In: Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 2: Short Papers), pp. 141–150. Association for Computational Linguistics, Online (2021). https://doi.org/10.18653/v1/2021.acl-short.19 . https://aclanthology.org/2021.acl-short.19 McGuire and Tomuro [2021] McGuire, E., Tomuro, N.: Sentiment analysis with cognitive attention supervision. In: Canadian Conference on AI (2021) Dong et al. [2022] Dong, S., Goldstein, J., Yang, G.H.: GazBy: Gaze-Based BERT Model to Incorporate Human Attention in Neural Information Retrieval. In: Proceedings of the 2022 ACM SIGIR International Conference on Theory of Information Retrieval, pp. 182–192 (2022). https://doi.org/10.1145/3539813.3545129 Eberle et al. [2022] Eberle, O., Brandl, S., Pilot, J., Søgaard, A.: Do transformer models show similar attention patterns to task-specific human gaze? In: Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pp. 4295–4309. Association for Computational Linguistics, Dublin, Ireland (2022). https://doi.org/10.18653/v1/2022.acl-long.296 . https://aclanthology.org/2022.acl-long.296 Hollenstein et al. [2020] Hollenstein, N., Troendle, M., Zhang, C., Langer, N.: ZuCo 2.0: A dataset of physiological recordings during natural reading and annotation. In: Proceedings of the Twelfth Language Resources and Evaluation Conference, pp. 138–146. European Language Resources Association, Marseille, France (2020). https://aclanthology.org/2020.lrec-1.18 Malmaud et al. [2020] Malmaud, J., Levy, R., Berzak, Y.: Bridging information-seeking human gaze and machine reading comprehension. In: Proceedings of the 24th Conference on Computational Natural Language Learning, pp. 142–152. 
Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.conll-1.11
de Leeuw, J.R.: jsPsych: A JavaScript library for creating behavioral experiments in a Web browser. Behavior Research Methods 47(1), 1–12 (2015). https://doi.org/10.3758/s13428-014-0458-y
Papoutsaki, A., Sangkloy, P., Laskey, J., Daskalova, N., Huang, J., Hays, J.: WebGazer: Scalable webcam eye tracking using user interactions. In: Proceedings of the Twenty-Fifth International Joint Conference on Artificial Intelligence. IJCAI’16, pp. 3839–3845. AAAI Press (2016)
Rayner, K., Pollatsek, A., Ashby, J., Clifton Jr, C.: Psychology of Reading (2012)
Liversedge, S.P., Findlay, J.M.: Saccadic eye movements and cognition. Trends in Cognitive Sciences 4(1), 6–14 (2000)
Rayner, K.: Eye movements in reading and information processing: 20 years of research. Psychological Bulletin 124(3), 372 (1998)
Radach, R., Kennedy, A.: Theoretical perspectives on eye movements in reading: Past controversies, current issues, and an agenda for future research. European Journal of Cognitive Psychology 16(1–2), 3–26 (2004)
Winke, P.M.: Eye-tracking technology for reading. The Companion to Language Assessment 2, 1029–1046 (2013)
Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: ZuCo, a simultaneous EEG and eye-tracking resource for natural sentence reading. Scientific Data 5(1), 1–13 (2018)
Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior Research Methods 49, 602–615 (2017)
Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: CELER: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022)
Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior Research Methods, 1–21 (2022)
Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: Activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016)
Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in English as a second language: Evidence from the Multilingual Eye-movements Corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023)
Hollenstein, N., Barrett, M., Björnsdóttir, M.: The Copenhagen Corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182
Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior Research Methods, 1–7 (2023)
Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: Empirical foundations for a minimal reporting guideline.
Behavior Research Methods 55(1), 364–416 (2023)
Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019)
Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer (2023)
Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence, pp. 4907–4913 (2021)
Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018). https://doi.org/10.3758/s13428-017-0908-4
González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050
Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016
Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030
Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2
Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012)
Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: How speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010)
Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016). https://doi.org/10.1155/2016/8680541
Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627
Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013). https://doi.org/10.1109/TPAMI.2012.101
Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014). https://doi.org/10.1109/TPAMI.2014.2313123
Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: TurkerGaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015)
Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Human Information Interaction and Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170
Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662
Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature Communications 11(1), 4553 (2020)
Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022). https://doi.org/10.1016/j.bspc.2022.103521
Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera: Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022)
Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023)
Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421
Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264
Manning, C.D., Raghavan, P., Schütze, H.: Introduction to Information Retrieval. Cambridge University Press, Cambridge (2008)
Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016). https://doi.org/10.3758/s13428-015-0642-8
Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000)
Just, M.A., Carpenter, P.A.: A theory of reading: From eye fixations to comprehension. Psychological Review 87(4), 329 (1980)
DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models.
In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408
Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021)
Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022)
Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023)
Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022)
Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155. IEEE (2021)
Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423
Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp.
1–7 (2022)
Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023)
[2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. 
arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. 
In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. 
[2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. 
[2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . 
https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. 
[2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Barrett, M., Bingel, J., Keller, F., Søgaard, A.: Weakly Supervised Part-of-speech Tagging Using Eye-tracking Data. In: Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers), pp. 579–584. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/P16-2094 . https://aclanthology.org/P16-2094 Hollenstein and Zhang [2019] Hollenstein, N., Zhang, C.: Entity recognition at first sight: Improving NER with eye movement information. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 1–10. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1001 . https://aclanthology.org/N19-1001 Hollenstein and Beinborn [2021] Hollenstein, N., Beinborn, L.: Relative importance in sentence processing. In: Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 2: Short Papers), pp. 141–150. Association for Computational Linguistics, Online (2021). https://doi.org/10.18653/v1/2021.acl-short.19 . https://aclanthology.org/2021.acl-short.19 McGuire and Tomuro [2021] McGuire, E., Tomuro, N.: Sentiment analysis with cognitive attention supervision. In: Canadian Conference on AI (2021) Dong et al. 
[2022] Dong, S., Goldstein, J., Yang, G.H.: GazBy: Gaze-Based BERT Model to Incorporate Human Attention in Neural Information Retrieval. In: Proceedings of the 2022 ACM SIGIR International Conference on Theory of Information Retrieval, pp. 182–192 (2022). https://doi.org/10.1145/3539813.3545129 Eberle et al. [2022] Eberle, O., Brandl, S., Pilot, J., Søgaard, A.: Do transformer models show similar attention patterns to task-specific human gaze? In: Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pp. 4295–4309. Association for Computational Linguistics, Dublin, Ireland (2022). https://doi.org/10.18653/v1/2022.acl-long.296 . https://aclanthology.org/2022.acl-long.296 Hollenstein et al. [2020] Hollenstein, N., Troendle, M., Zhang, C., Langer, N.: ZuCo 2.0: A dataset of physiological recordings during natural reading and annotation. In: Proceedings of the Twelfth Language Resources and Evaluation Conference, pp. 138–146. European Language Resources Association, Marseille, France (2020). https://aclanthology.org/2020.lrec-1.18 Malmaud et al. [2020] Malmaud, J., Levy, R., Berzak, Y.: Bridging information-seeking human gaze and machine reading comprehension. In: Proceedings of the 24th Conference on Computational Natural Language Learning, pp. 142–152. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.conll-1.11 . https://aclanthology.org/2020.conll-1.11 de Leeuw [2015] Leeuw, J.R.: jsPsych: A JavaScript library for creating behavioral experiments in a Web browser. Behavior Research Methods 47(1), 1–12 (2015) https://doi.org/10.3758/s13428-014-0458-y Papoutsaki et al. [2016] Papoutsaki, A., Sangkloy, P., Laskey, J., Daskalova, N., Huang, J., Hays, J.: Webgazer: Scalable webcam eye tracking using user interactions. In: Proceedings of the Twenty-Fifth International Joint Conference on Artificial Intelligence. IJCAI’16, pp. 3839–3845. AAAI Press, ??? (2016) Rayner et al. 
[2012] Rayner, K., Pollatsek, A., Ashby, J., Clifton Jr, C.: Psychology of reading (2012) Liversedge and Findlay [2000] Liversedge, S.P., Findlay, J.M.: Saccadic eye movements and cognition. Trends in cognitive sciences 4(1), 6–14 (2000) Rayner [1998] Rayner, K.: Eye movements in reading and information processing: 20 years of research. Psychological bulletin 124(3), 372 (1998) Radach and Kennedy [2004] Radach, R., Kennedy, A.: Theoretical perspectives on eye movements in reading: Past controversies, current issues, and an agenda for future research. European journal of cognitive psychology 16(1-2), 3–26 (2004) Winke [2013] Winke, P.M.: Eye-tracking technology for reading. The companion to language assessment 2, 1029–1046 (2013) Hollenstein et al. [2018] Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: Zuco, a simultaneous eeg and eye-tracking resource for natural sentence reading. Scientific data 5(1), 1–13 (2018) Cop et al. [2017] Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior research methods 49, 602–615 (2017) Berzak et al. [2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: Celer: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. 
[2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 
4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. 
[2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. 
[2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. 
[2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Hollenstein and Zhang [2019] Hollenstein, N., Zhang, C.: Entity recognition at first sight: Improving NER with eye movement information. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 1–10. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1001 . https://aclanthology.org/N19-1001 Hollenstein and Beinborn [2021] Hollenstein, N., Beinborn, L.: Relative importance in sentence processing. In: Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 2: Short Papers), pp. 141–150. Association for Computational Linguistics, Online (2021). https://doi.org/10.18653/v1/2021.acl-short.19 . https://aclanthology.org/2021.acl-short.19 McGuire and Tomuro [2021] McGuire, E., Tomuro, N.: Sentiment analysis with cognitive attention supervision. In: Canadian Conference on AI (2021) Dong et al. [2022] Dong, S., Goldstein, J., Yang, G.H.: GazBy: Gaze-Based BERT Model to Incorporate Human Attention in Neural Information Retrieval. In: Proceedings of the 2022 ACM SIGIR International Conference on Theory of Information Retrieval, pp. 182–192 (2022). https://doi.org/10.1145/3539813.3545129 Eberle et al. [2022] Eberle, O., Brandl, S., Pilot, J., Søgaard, A.: Do transformer models show similar attention patterns to task-specific human gaze? In: Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pp. 4295–4309. Association for Computational Linguistics, Dublin, Ireland (2022). https://doi.org/10.18653/v1/2022.acl-long.296 . 
https://aclanthology.org/2022.acl-long.296 Hollenstein et al. [2020] Hollenstein, N., Troendle, M., Zhang, C., Langer, N.: ZuCo 2.0: A dataset of physiological recordings during natural reading and annotation. In: Proceedings of the Twelfth Language Resources and Evaluation Conference, pp. 138–146. European Language Resources Association, Marseille, France (2020). https://aclanthology.org/2020.lrec-1.18 Malmaud et al. [2020] Malmaud, J., Levy, R., Berzak, Y.: Bridging information-seeking human gaze and machine reading comprehension. In: Proceedings of the 24th Conference on Computational Natural Language Learning, pp. 142–152. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.conll-1.11 . https://aclanthology.org/2020.conll-1.11 de Leeuw [2015] de Leeuw, J.R.: jsPsych: A JavaScript library for creating behavioral experiments in a Web browser. Behavior Research Methods 47(1), 1–12 (2015) https://doi.org/10.3758/s13428-014-0458-y Papoutsaki et al. [2016] Papoutsaki, A., Sangkloy, P., Laskey, J., Daskalova, N., Huang, J., Hays, J.: WebGazer: Scalable webcam eye tracking using user interactions. In: Proceedings of the Twenty-Fifth International Joint Conference on Artificial Intelligence. IJCAI’16, pp. 3839–3845. AAAI Press, ??? (2016) Rayner et al. [2012] Rayner, K., Pollatsek, A., Ashby, J., Clifton Jr, C.: Psychology of reading (2012) Liversedge and Findlay [2000] Liversedge, S.P., Findlay, J.M.: Saccadic eye movements and cognition. Trends in cognitive sciences 4(1), 6–14 (2000) Rayner [1998] Rayner, K.: Eye movements in reading and information processing: 20 years of research. Psychological bulletin 124(3), 372 (1998) Radach and Kennedy [2004] Radach, R., Kennedy, A.: Theoretical perspectives on eye movements in reading: Past controversies, current issues, and an agenda for future research. European journal of cognitive psychology 16(1-2), 3–26 (2004) Winke [2013] Winke, P.M.: Eye-tracking technology for reading. 
The companion to language assessment 2, 1029–1046 (2013) Hollenstein et al. [2018] Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: ZuCo, a simultaneous EEG and eye-tracking resource for natural sentence reading. Scientific data 5(1), 1–13 (2018) Cop et al. [2017] Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior research methods 49, 602–615 (2017) Berzak et al. [2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: Celer: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in English as a second language: Evidence from the Multilingual Eye-movements Corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The Copenhagen Corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. 
[2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. 
In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . 
Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: TurkerGaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Human Information Interaction and Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. 
Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421
[2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. 
arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. 
In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. 
[2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. 
[2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . 
https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. 
[2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Dong, S., Goldstein, J., Yang, G.H.: GazBy: Gaze-Based BERT Model to Incorporate Human Attention in Neural Information Retrieval. In: Proceedings of the 2022 ACM SIGIR International Conference on Theory of Information Retrieval, pp. 182–192 (2022). https://doi.org/10.1145/3539813.3545129 Eberle et al. [2022] Eberle, O., Brandl, S., Pilot, J., Søgaard, A.: Do transformer models show similar attention patterns to task-specific human gaze? In: Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pp. 4295–4309. Association for Computational Linguistics, Dublin, Ireland (2022). https://doi.org/10.18653/v1/2022.acl-long.296 . https://aclanthology.org/2022.acl-long.296 Hollenstein et al. [2020] Hollenstein, N., Troendle, M., Zhang, C., Langer, N.: ZuCo 2.0: A dataset of physiological recordings during natural reading and annotation. In: Proceedings of the Twelfth Language Resources and Evaluation Conference, pp. 138–146. European Language Resources Association, Marseille, France (2020). https://aclanthology.org/2020.lrec-1.18 Malmaud et al. [2020] Malmaud, J., Levy, R., Berzak, Y.: Bridging information-seeking human gaze and machine reading comprehension. In: Proceedings of the 24th Conference on Computational Natural Language Learning, pp. 142–152. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.conll-1.11 . 
https://aclanthology.org/2020.conll-1.11 de Leeuw [2015] Leeuw, J.R.: jsPsych: A JavaScript library for creating behavioral experiments in a Web browser. Behavior Research Methods 47(1), 1–12 (2015) https://doi.org/10.3758/s13428-014-0458-y Papoutsaki et al. [2016] Papoutsaki, A., Sangkloy, P., Laskey, J., Daskalova, N., Huang, J., Hays, J.: Webgazer: Scalable webcam eye tracking using user interactions. In: Proceedings of the Twenty-Fifth International Joint Conference on Artificial Intelligence. IJCAI’16, pp. 3839–3845. AAAI Press, ??? (2016) Rayner et al. [2012] Rayner, K., Pollatsek, A., Ashby, J., Clifton Jr, C.: Psychology of reading (2012) Liversedge and Findlay [2000] Liversedge, S.P., Findlay, J.M.: Saccadic eye movements and cognition. Trends in cognitive sciences 4(1), 6–14 (2000) Rayner [1998] Rayner, K.: Eye movements in reading and information processing: 20 years of research. Psychological bulletin 124(3), 372 (1998) Radach and Kennedy [2004] Radach, R., Kennedy, A.: Theoretical perspectives on eye movements in reading: Past controversies, current issues, and an agenda for future research. European journal of cognitive psychology 16(1-2), 3–26 (2004) Winke [2013] Winke, P.M.: Eye-tracking technology for reading. The companion to language assessment 2, 1029–1046 (2013) Hollenstein et al. [2018] Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: Zuco, a simultaneous eeg and eye-tracking resource for natural sentence reading. Scientific data 5(1), 1–13 (2018) Cop et al. [2017] Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior research methods 49, 602–615 (2017) Berzak et al. [2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: Celer: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. 
[2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. 
arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. 
In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. 
[2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. 
[2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . 
https://aclanthology.org/2020.acl-main.408
Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021)
Chiang, C.-H., Lee, H.-Y.: Re-examining human annotations for interpretable NLP (2022)
Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023)
Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022)
Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155. IEEE (2021)
Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423
Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022)
Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023)
Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
Eberle, O., Brandl, S., Pilot, J., Søgaard, A.: Do transformer models show similar attention patterns to task-specific human gaze? In: Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pp. 4295–4309. Association for Computational Linguistics, Dublin, Ireland (2022). https://doi.org/10.18653/v1/2022.acl-long.296
Hollenstein, N., Troendle, M., Zhang, C., Langer, N.: ZuCo 2.0: A dataset of physiological recordings during natural reading and annotation. In: Proceedings of the Twelfth Language Resources and Evaluation Conference, pp. 138–146. European Language Resources Association, Marseille, France (2020). https://aclanthology.org/2020.lrec-1.18
Malmaud, J., Levy, R., Berzak, Y.: Bridging information-seeking human gaze and machine reading comprehension. In: Proceedings of the 24th Conference on Computational Natural Language Learning, pp. 142–152. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.conll-1.11
de Leeuw, J.R.: jsPsych: A JavaScript library for creating behavioral experiments in a Web browser. Behavior Research Methods 47(1), 1–12 (2015). https://doi.org/10.3758/s13428-014-0458-y
Papoutsaki, A., Sangkloy, P., Laskey, J., Daskalova, N., Huang, J., Hays, J.: WebGazer: Scalable webcam eye tracking using user interactions. In: Proceedings of the Twenty-Fifth International Joint Conference on Artificial Intelligence (IJCAI'16), pp. 3839–3845. AAAI Press (2016)
Rayner, K., Pollatsek, A., Ashby, J., Clifton Jr., C.: Psychology of Reading (2012)
Liversedge, S.P., Findlay, J.M.: Saccadic eye movements and cognition. Trends in Cognitive Sciences 4(1), 6–14 (2000)
Rayner, K.: Eye movements in reading and information processing: 20 years of research. Psychological Bulletin 124(3), 372 (1998)
Radach, R., Kennedy, A.: Theoretical perspectives on eye movements in reading: Past controversies, current issues, and an agenda for future research. European Journal of Cognitive Psychology 16(1-2), 3–26 (2004)
Winke, P.M.: Eye-tracking technology for reading. The Companion to Language Assessment 2, 1029–1046 (2013)
Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: ZuCo, a simultaneous EEG and eye-tracking resource for natural sentence reading. Scientific Data 5(1), 1–13 (2018)
Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior Research Methods 49, 602–615 (2017)
Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: CELER: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022)
Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior Research Methods, 1–21 (2022)
Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: Activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016)
Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in English as a second language: Evidence from the Multilingual Eye-movements Corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023)
Hollenstein, N., Barrett, M., Björnsdóttir, M.: The Copenhagen Corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182
Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior Research Methods, 1–7 (2023)
Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: Empirical foundations for a minimal reporting guideline. Behavior Research Methods 55(1), 364–416 (2023)
Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019)
Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer (2023)
Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence, pp. 4907–4913 (2021)
Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018). https://doi.org/10.3758/s13428-017-0908-4
González-Garduño, A.V., Søgaard, A.: Using gaze to predict text readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050
Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016
Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence classification with human attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030
Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2
Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012)
Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: How speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010)
Ferhat, O., Vilariño, F.: Low cost eye tracking: The current panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016). https://doi.org/10.1155/2016/8680541
Papoutsaki, A.: Scalable webcam eye tracking by learning from user interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems (CHI EA '15), pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627
Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013). https://doi.org/10.1109/TPAMI.2012.101
Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014). https://doi.org/10.1109/TPAMI.2014.2313123
Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: TurkerGaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015)
Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam eye tracking for remote studies of web search. In: Proceedings of the 2017 Conference on Human Information Interaction and Retrieval (CHIIR '17), pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170
Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-based personalized summarization of Wikipedia reading session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext (HUMAN'20). Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662
Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature Communications 11(1), 4553 (2020)
Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022). https://doi.org/10.1016/j.bspc.2022.103521
Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera: Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022)
Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023)
Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421
Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264
Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval, vol. 39. Cambridge University Press, Cambridge (2008)
Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016). https://doi.org/10.3758/s13428-015-0642-8
Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000)
Just, M.A., Carpenter, P.A.: A theory of reading: From eye fixations to comprehension. Psychological Review 87(4), 329 (1980)
DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408
https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. 
[2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
de Leeuw [2015] de Leeuw, J.R.: jsPsych: A JavaScript library for creating behavioral experiments in a Web browser. Behavior Research Methods 47(1), 1–12 (2015). https://doi.org/10.3758/s13428-014-0458-y
Papoutsaki et al. [2016] Papoutsaki, A., Sangkloy, P., Laskey, J., Daskalova, N., Huang, J., Hays, J.: WebGazer: Scalable webcam eye tracking using user interactions. In: Proceedings of the Twenty-Fifth International Joint Conference on Artificial Intelligence. IJCAI'16, pp. 3839–3845. AAAI Press (2016)
1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Rayner, K., Pollatsek, A., Ashby, J., Clifton Jr, C.: Psychology of reading (2012) Liversedge and Findlay [2000] Liversedge, S.P., Findlay, J.M.: Saccadic eye movements and cognition. Trends in cognitive sciences 4(1), 6–14 (2000) Rayner [1998] Rayner, K.: Eye movements in reading and information processing: 20 years of research. Psychological bulletin 124(3), 372 (1998) Radach and Kennedy [2004] Radach, R., Kennedy, A.: Theoretical perspectives on eye movements in reading: Past controversies, current issues, and an agenda for future research. European journal of cognitive psychology 16(1-2), 3–26 (2004) Winke [2013] Winke, P.M.: Eye-tracking technology for reading. The companion to language assessment 2, 1029–1046 (2013) Hollenstein et al. [2018] Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: Zuco, a simultaneous eeg and eye-tracking resource for natural sentence reading. Scientific data 5(1), 1–13 (2018) Cop et al. [2017] Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior research methods 49, 602–615 (2017) Berzak et al. [2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: Celer: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. 
[2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. 
arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. 
[2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. 
[2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. 
[2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: Zuco, a simultaneous eeg and eye-tracking resource for natural sentence reading. Scientific data 5(1), 1–13 (2018) Cop et al. [2017] Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior research methods 49, 602–615 (2017) Berzak et al. [2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: Celer: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. 
Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 
4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. 
[2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. 
[2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. 
[2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. 
In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . 
https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. 
Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. 
[2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. 
In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. 
[2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. 
Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. 
[2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. 
Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. 
[2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. 
[2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. 
[2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 
71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. 
Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. 
Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012)
Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: How speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010)
Ferhat, O., Vilariño, F.: Low cost eye tracking: The current panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016). https://doi.org/10.1155/2016/8680541
Papoutsaki, A.: Scalable webcam eye tracking by learning from user interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems (CHI EA '15), pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627
Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013). https://doi.org/10.1109/TPAMI.2012.101
Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014). https://doi.org/10.1109/TPAMI.2014.2313123
Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: TurkerGaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015)
Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam eye tracking for remote studies of web search. In: Proceedings of the 2017 Conference on Human Information Interaction and Retrieval (CHIIR '17), pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170
Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-based personalized summarization of Wikipedia reading session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext (HUMAN '20). Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662
Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature Communications 11(1), 4553 (2020)
Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022). https://doi.org/10.1016/j.bspc.2022.103521
Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera: Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022)
Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023)
Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421
Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264
Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval, vol. 39. Cambridge University Press, Cambridge (2008)
Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016). https://doi.org/10.3758/s13428-015-0642-8
Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000)
Just, M.A., Carpenter, P.A.: A theory of reading: From eye fixations to comprehension. Psychological Review 87(4), 329 (1980)
DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408
Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021)
Chiang, C.-H., Lee, H.-Y.: Re-examining human annotations for interpretable NLP (2022)
Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023)
Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022)
Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155. IEEE (2021)
Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423
Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022)
Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023)
Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence classification with human attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030
Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. 
[2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. 
In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 
1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. 
[2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . 
https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. 
[2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. 
[2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. 
[2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. 
In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. 
Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421. https://aclanthology.org/2020.acl-main.421

Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264. https://aclanthology.org/D16-1264

Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval, vol. 39. Cambridge University Press, Cambridge (2008)

Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016). https://doi.org/10.3758/s13428-015-0642-8

Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000)

Just, M.A., Carpenter, P.A.: A theory of reading: From eye fixations to comprehension. Psychological Review 87(4), 329 (1980)

DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408. https://aclanthology.org/2020.acl-main.408

Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021)

Chiang, C.-H., Lee, H.-Y.: Re-examining human annotations for interpretable NLP (2022)

Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023)

Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022)

Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155. IEEE (2021)

Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423. https://aclanthology.org/N19-1423

Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022)

Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023)

Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)

Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023)
https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
4. Sood, E., Tannert, S., Frassinelli, D., Bulling, A., Vu, N.T.: Interpreting attention models with human visual attention in machine reading comprehension. In: Proceedings of the 24th Conference on Computational Natural Language Learning, pp. 12–25. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.conll-1.2
5. Brandl, S., Hollenstein, N.: Every word counts: A multilingual analysis of individual human alignment with model attention. In: Proceedings of the 2nd Conference of the Asia-Pacific Chapter of the Association for Computational Linguistics and the 12th International Joint Conference on Natural Language Processing (Volume 2: Short Papers), pp. 72–77. Association for Computational Linguistics, Online only (2022). https://aclanthology.org/2022.aacl-short.10
6. Søgaard, A.: Explainable Natural Language Processing. Morgan & Claypool Publishers (2021)
7. Xu, S., Jiang, H., Lau, F.C.M.: User-oriented document summarization through vision-based eye-tracking. In: Proceedings of the 14th International Conference on Intelligent User Interfaces. IUI '09, pp. 7–16. Association for Computing Machinery, New York, NY, USA (2009). https://doi.org/10.1145/1502650.1502656
8. Barrett, M., Bingel, J., Keller, F., Søgaard, A.: Weakly supervised part-of-speech tagging using eye-tracking data. In: Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers), pp. 579–584. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/P16-2094
9. Hollenstein, N., Zhang, C.: Entity recognition at first sight: Improving NER with eye movement information. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 1–10. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1001
10. Hollenstein, N., Beinborn, L.: Relative importance in sentence processing. In: Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 2: Short Papers), pp. 141–150. Association for Computational Linguistics, Online (2021). https://doi.org/10.18653/v1/2021.acl-short.19
11. McGuire, E., Tomuro, N.: Sentiment analysis with cognitive attention supervision. In: Canadian Conference on AI (2021)
12. Dong, S., Goldstein, J., Yang, G.H.: GazBy: Gaze-based BERT model to incorporate human attention in neural information retrieval. In: Proceedings of the 2022 ACM SIGIR International Conference on Theory of Information Retrieval, pp. 182–192 (2022). https://doi.org/10.1145/3539813.3545129
13. Eberle, O., Brandl, S., Pilot, J., Søgaard, A.: Do transformer models show similar attention patterns to task-specific human gaze? In: Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pp. 4295–4309. Association for Computational Linguistics, Dublin, Ireland (2022). https://doi.org/10.18653/v1/2022.acl-long.296
14. Hollenstein, N., Troendle, M., Zhang, C., Langer, N.: ZuCo 2.0: A dataset of physiological recordings during natural reading and annotation. In: Proceedings of the Twelfth Language Resources and Evaluation Conference, pp. 138–146. European Language Resources Association, Marseille, France (2020). https://aclanthology.org/2020.lrec-1.18
15. Malmaud, J., Levy, R., Berzak, Y.: Bridging information-seeking human gaze and machine reading comprehension. In: Proceedings of the 24th Conference on Computational Natural Language Learning, pp. 142–152. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.conll-1.11
16. de Leeuw, J.R.: jsPsych: A JavaScript library for creating behavioral experiments in a Web browser. Behavior Research Methods 47(1), 1–12 (2015). https://doi.org/10.3758/s13428-014-0458-y
17. Papoutsaki, A., Sangkloy, P., Laskey, J., Daskalova, N., Huang, J., Hays, J.: WebGazer: Scalable webcam eye tracking using user interactions. In: Proceedings of the Twenty-Fifth International Joint Conference on Artificial Intelligence. IJCAI '16, pp. 3839–3845. AAAI Press (2016)
18. Rayner, K., Pollatsek, A., Ashby, J., Clifton Jr, C.: Psychology of reading (2012)
19. Liversedge, S.P., Findlay, J.M.: Saccadic eye movements and cognition. Trends in Cognitive Sciences 4(1), 6–14 (2000)
20. Rayner, K.: Eye movements in reading and information processing: 20 years of research. Psychological Bulletin 124(3), 372 (1998)
21. Radach, R., Kennedy, A.: Theoretical perspectives on eye movements in reading: Past controversies, current issues, and an agenda for future research. European Journal of Cognitive Psychology 16(1–2), 3–26 (2004)
22. Winke, P.M.: Eye-tracking technology for reading. The Companion to Language Assessment 2, 1029–1046 (2013)
23. Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: ZuCo, a simultaneous EEG and eye-tracking resource for natural sentence reading. Scientific Data 5(1), 1–13 (2018)
24. Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior Research Methods 49, 602–615 (2017)
25. Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: CELER: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022)
26. Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior Research Methods, 1–21 (2022)
27. Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: Activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016)
28. Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in English as a second language: Evidence from the Multilingual Eye-movements Corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023)
29. Hollenstein, N., Barrett, M., Björnsdóttir, M.: The Copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182
30. Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior Research Methods, 1–7 (2023)
31. Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: Empirical foundations for a minimal reporting guideline. Behavior Research Methods 55(1), 364–416 (2023)
32. Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019)
33. Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer (2023)
34. Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence, pp. 4907–4913 (2021)
35. Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018). https://doi.org/10.3758/s13428-017-0908-4
36. González-Garduño, A.V., Søgaard, A.: Using gaze to predict text readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050
37. Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016
38. Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence classification with human attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030
39. Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2
40. Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012)
41. Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: How speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010)
42. Ferhat, O., Vilariño, F.: Low cost eye tracking: The current panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016). https://doi.org/10.1155/2016/8680541
43. Papoutsaki, A.: Scalable webcam eye tracking by learning from user interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA '15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627
44. Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013). https://doi.org/10.1109/TPAMI.2012.101
45. Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014). https://doi.org/10.1109/TPAMI.2014.2313123
46. Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: TurkerGaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015)
47. Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam eye tracking for remote studies of web search. In: Proceedings of the 2017 Conference on Human Information Interaction and Retrieval. CHIIR '17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170
48. Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-based personalized summarization of Wikipedia reading session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN '20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662
49. Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature Communications 11(1), 4553 (2020)
50. Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022). https://doi.org/10.1016/j.bspc.2022.103521
51. Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera: Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022)
52. Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023)
53. Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421
54. Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264
55. Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval, vol. 39. Cambridge University Press, Cambridge (2008)
56. Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016). https://doi.org/10.3758/s13428-015-0642-8
57. Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000)
58. Just, M.A., Carpenter, P.A.: A theory of reading: From eye fixations to comprehension. Psychological Review 87(4), 329 (1980)
59. DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408
60. Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021)
61. Chiang, C.-H., Lee, H.-Y.: Re-examining human annotations for interpretable NLP (2022)
62. Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023)
63. Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022)
64. Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155. IEEE (2021)
65. Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423
66. Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022)
67. Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023)
68. Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
In: Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers), pp. 579–584. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/P16-2094 . https://aclanthology.org/P16-2094 Hollenstein and Zhang [2019] Hollenstein, N., Zhang, C.: Entity recognition at first sight: Improving NER with eye movement information. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 1–10. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1001 . https://aclanthology.org/N19-1001 Hollenstein and Beinborn [2021] Hollenstein, N., Beinborn, L.: Relative importance in sentence processing. In: Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 2: Short Papers), pp. 141–150. Association for Computational Linguistics, Online (2021). https://doi.org/10.18653/v1/2021.acl-short.19 . https://aclanthology.org/2021.acl-short.19 McGuire and Tomuro [2021] McGuire, E., Tomuro, N.: Sentiment analysis with cognitive attention supervision. In: Canadian Conference on AI (2021) Dong et al. [2022] Dong, S., Goldstein, J., Yang, G.H.: GazBy: Gaze-Based BERT Model to Incorporate Human Attention in Neural Information Retrieval. In: Proceedings of the 2022 ACM SIGIR International Conference on Theory of Information Retrieval, pp. 182–192 (2022). https://doi.org/10.1145/3539813.3545129 Eberle et al. [2022] Eberle, O., Brandl, S., Pilot, J., Søgaard, A.: Do transformer models show similar attention patterns to task-specific human gaze? In: Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pp. 4295–4309. 
Association for Computational Linguistics, Dublin, Ireland (2022). https://doi.org/10.18653/v1/2022.acl-long.296 . https://aclanthology.org/2022.acl-long.296 Hollenstein et al. [2020] Hollenstein, N., Troendle, M., Zhang, C., Langer, N.: ZuCo 2.0: A dataset of physiological recordings during natural reading and annotation. In: Proceedings of the Twelfth Language Resources and Evaluation Conference, pp. 138–146. European Language Resources Association, Marseille, France (2020). https://aclanthology.org/2020.lrec-1.18 Malmaud et al. [2020] Malmaud, J., Levy, R., Berzak, Y.: Bridging information-seeking human gaze and machine reading comprehension. In: Proceedings of the 24th Conference on Computational Natural Language Learning, pp. 142–152. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.conll-1.11 . https://aclanthology.org/2020.conll-1.11 de Leeuw [2015] Leeuw, J.R.: jsPsych: A JavaScript library for creating behavioral experiments in a Web browser. Behavior Research Methods 47(1), 1–12 (2015) https://doi.org/10.3758/s13428-014-0458-y Papoutsaki et al. [2016] Papoutsaki, A., Sangkloy, P., Laskey, J., Daskalova, N., Huang, J., Hays, J.: Webgazer: Scalable webcam eye tracking using user interactions. In: Proceedings of the Twenty-Fifth International Joint Conference on Artificial Intelligence. IJCAI’16, pp. 3839–3845. AAAI Press, ??? (2016) Rayner et al. [2012] Rayner, K., Pollatsek, A., Ashby, J., Clifton Jr, C.: Psychology of reading (2012) Liversedge and Findlay [2000] Liversedge, S.P., Findlay, J.M.: Saccadic eye movements and cognition. Trends in cognitive sciences 4(1), 6–14 (2000) Rayner [1998] Rayner, K.: Eye movements in reading and information processing: 20 years of research. Psychological bulletin 124(3), 372 (1998) Radach and Kennedy [2004] Radach, R., Kennedy, A.: Theoretical perspectives on eye movements in reading: Past controversies, current issues, and an agenda for future research. 
European journal of cognitive psychology 16(1-2), 3–26 (2004) Winke [2013] Winke, P.M.: Eye-tracking technology for reading. The companion to language assessment 2, 1029–1046 (2013) Hollenstein et al. [2018] Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: Zuco, a simultaneous eeg and eye-tracking resource for natural sentence reading. Scientific data 5(1), 1–13 (2018) Cop et al. [2017] Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior research methods 49, 602–615 (2017) Berzak et al. [2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: Celer: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). 
https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. 
[2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. 
Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Human Information Interaction and Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. 
[2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. 
Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Søgaard [2021] Søgaard, A.: Explainable Natural Language Processing. Morgan & Claypool Publishers, ??? (2021) Xu et al. [2009] Xu, S., Jiang, H., Lau, F.C.M.: User-Oriented Document Summarization through Vision-Based Eye-Tracking. In: Proceedings of the 14th International Conference on Intelligent User Interfaces. IUI ’09, pp. 7–16. Association for Computing Machinery, New York, NY, USA (2009). https://doi.org/10.1145/1502650.1502656 Barrett et al. [2016] Barrett, M., Bingel, J., Keller, F., Søgaard, A.: Weakly Supervised Part-of-speech Tagging Using Eye-tracking Data. In: Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers), pp. 579–584. 
Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/P16-2094 . https://aclanthology.org/P16-2094 Hollenstein and Zhang [2019] Hollenstein, N., Zhang, C.: Entity recognition at first sight: Improving NER with eye movement information. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 1–10. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1001 . https://aclanthology.org/N19-1001 Hollenstein and Beinborn [2021] Hollenstein, N., Beinborn, L.: Relative importance in sentence processing. In: Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 2: Short Papers), pp. 141–150. Association for Computational Linguistics, Online (2021). https://doi.org/10.18653/v1/2021.acl-short.19 . https://aclanthology.org/2021.acl-short.19 McGuire and Tomuro [2021] McGuire, E., Tomuro, N.: Sentiment analysis with cognitive attention supervision. In: Canadian Conference on AI (2021) Dong et al. [2022] Dong, S., Goldstein, J., Yang, G.H.: GazBy: Gaze-Based BERT Model to Incorporate Human Attention in Neural Information Retrieval. In: Proceedings of the 2022 ACM SIGIR International Conference on Theory of Information Retrieval, pp. 182–192 (2022). https://doi.org/10.1145/3539813.3545129 Eberle et al. [2022] Eberle, O., Brandl, S., Pilot, J., Søgaard, A.: Do transformer models show similar attention patterns to task-specific human gaze? In: Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pp. 4295–4309. Association for Computational Linguistics, Dublin, Ireland (2022). https://doi.org/10.18653/v1/2022.acl-long.296 . https://aclanthology.org/2022.acl-long.296 Hollenstein et al. 
[2020] Hollenstein, N., Troendle, M., Zhang, C., Langer, N.: ZuCo 2.0: A dataset of physiological recordings during natural reading and annotation. In: Proceedings of the Twelfth Language Resources and Evaluation Conference, pp. 138–146. European Language Resources Association, Marseille, France (2020). https://aclanthology.org/2020.lrec-1.18 Malmaud et al. [2020] Malmaud, J., Levy, R., Berzak, Y.: Bridging information-seeking human gaze and machine reading comprehension. In: Proceedings of the 24th Conference on Computational Natural Language Learning, pp. 142–152. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.conll-1.11 . https://aclanthology.org/2020.conll-1.11 de Leeuw [2015] de Leeuw, J.R.: jsPsych: A JavaScript library for creating behavioral experiments in a Web browser. Behavior Research Methods 47(1), 1–12 (2015) https://doi.org/10.3758/s13428-014-0458-y Papoutsaki et al. [2016] Papoutsaki, A., Sangkloy, P., Laskey, J., Daskalova, N., Huang, J., Hays, J.: Webgazer: Scalable webcam eye tracking using user interactions. In: Proceedings of the Twenty-Fifth International Joint Conference on Artificial Intelligence. IJCAI’16, pp. 3839–3845. AAAI Press, ??? (2016) Rayner et al. [2012] Rayner, K., Pollatsek, A., Ashby, J., Clifton Jr, C.: Psychology of reading (2012) Liversedge and Findlay [2000] Liversedge, S.P., Findlay, J.M.: Saccadic eye movements and cognition. Trends in cognitive sciences 4(1), 6–14 (2000) Rayner [1998] Rayner, K.: Eye movements in reading and information processing: 20 years of research. Psychological bulletin 124(3), 372 (1998) Radach and Kennedy [2004] Radach, R., Kennedy, A.: Theoretical perspectives on eye movements in reading: Past controversies, current issues, and an agenda for future research. European journal of cognitive psychology 16(1-2), 3–26 (2004)
In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . 
https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. 
Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. 
Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Barrett, M., Bingel, J., Keller, F., Søgaard, A.: Weakly Supervised Part-of-speech Tagging Using Eye-tracking Data. In: Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers), pp. 579–584. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/P16-2094 . https://aclanthology.org/P16-2094 Hollenstein and Zhang [2019] Hollenstein, N., Zhang, C.: Entity recognition at first sight: Improving NER with eye movement information. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 1–10. 
Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1001 . https://aclanthology.org/N19-1001 Hollenstein and Beinborn [2021] Hollenstein, N., Beinborn, L.: Relative importance in sentence processing. In: Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 2: Short Papers), pp. 141–150. Association for Computational Linguistics, Online (2021). https://doi.org/10.18653/v1/2021.acl-short.19 . https://aclanthology.org/2021.acl-short.19 McGuire and Tomuro [2021] McGuire, E., Tomuro, N.: Sentiment analysis with cognitive attention supervision. In: Canadian Conference on AI (2021) Dong et al. [2022] Dong, S., Goldstein, J., Yang, G.H.: GazBy: Gaze-Based BERT Model to Incorporate Human Attention in Neural Information Retrieval. In: Proceedings of the 2022 ACM SIGIR International Conference on Theory of Information Retrieval, pp. 182–192 (2022). https://doi.org/10.1145/3539813.3545129 Eberle et al. [2022] Eberle, O., Brandl, S., Pilot, J., Søgaard, A.: Do transformer models show similar attention patterns to task-specific human gaze? In: Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pp. 4295–4309. Association for Computational Linguistics, Dublin, Ireland (2022). https://doi.org/10.18653/v1/2022.acl-long.296 . https://aclanthology.org/2022.acl-long.296 Hollenstein et al. [2020] Hollenstein, N., Troendle, M., Zhang, C., Langer, N.: ZuCo 2.0: A dataset of physiological recordings during natural reading and annotation. In: Proceedings of the Twelfth Language Resources and Evaluation Conference, pp. 138–146. European Language Resources Association, Marseille, France (2020). https://aclanthology.org/2020.lrec-1.18 Malmaud et al. [2020] Malmaud, J., Levy, R., Berzak, Y.: Bridging information-seeking human gaze and machine reading comprehension. 
In: Proceedings of the 24th Conference on Computational Natural Language Learning, pp. 142–152. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.conll-1.11 . https://aclanthology.org/2020.conll-1.11 de Leeuw [2015] Leeuw, J.R.: jsPsych: A JavaScript library for creating behavioral experiments in a Web browser. Behavior Research Methods 47(1), 1–12 (2015) https://doi.org/10.3758/s13428-014-0458-y Papoutsaki et al. [2016] Papoutsaki, A., Sangkloy, P., Laskey, J., Daskalova, N., Huang, J., Hays, J.: Webgazer: Scalable webcam eye tracking using user interactions. In: Proceedings of the Twenty-Fifth International Joint Conference on Artificial Intelligence. IJCAI’16, pp. 3839–3845. AAAI Press, ??? (2016) Rayner et al. [2012] Rayner, K., Pollatsek, A., Ashby, J., Clifton Jr, C.: Psychology of reading (2012) Liversedge and Findlay [2000] Liversedge, S.P., Findlay, J.M.: Saccadic eye movements and cognition. Trends in cognitive sciences 4(1), 6–14 (2000) Rayner [1998] Rayner, K.: Eye movements in reading and information processing: 20 years of research. Psychological bulletin 124(3), 372 (1998) Radach and Kennedy [2004] Radach, R., Kennedy, A.: Theoretical perspectives on eye movements in reading: Past controversies, current issues, and an agenda for future research. European journal of cognitive psychology 16(1-2), 3–26 (2004) Winke [2013] Winke, P.M.: Eye-tracking technology for reading. The companion to language assessment 2, 1029–1046 (2013) Hollenstein et al. [2018] Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: Zuco, a simultaneous eeg and eye-tracking resource for natural sentence reading. Scientific data 5(1), 1–13 (2018) Cop et al. [2017] Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior research methods 49, 602–615 (2017) Berzak et al. 
[2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: Celer: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. 
Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . 
https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. 
[2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. 
[2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. 
In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 
1–7 (2022)
Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023)
Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
Hollenstein, N., Zhang, C.: Entity recognition at first sight: Improving NER with eye movement information. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 1–10. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1001
Hollenstein, N., Beinborn, L.: Relative importance in sentence processing. In: Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 2: Short Papers), pp. 141–150. Association for Computational Linguistics, Online (2021). https://doi.org/10.18653/v1/2021.acl-short.19
McGuire, E., Tomuro, N.: Sentiment analysis with cognitive attention supervision. In: Canadian Conference on AI (2021)
Dong, S., Goldstein, J., Yang, G.H.: GazBy: Gaze-based BERT model to incorporate human attention in neural information retrieval. In: Proceedings of the 2022 ACM SIGIR International Conference on Theory of Information Retrieval, pp. 182–192 (2022). https://doi.org/10.1145/3539813.3545129
Eberle, O., Brandl, S., Pilot, J., Søgaard, A.: Do transformer models show similar attention patterns to task-specific human gaze? In: Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pp. 4295–4309. Association for Computational Linguistics, Dublin, Ireland (2022). https://doi.org/10.18653/v1/2022.acl-long.296
Hollenstein, N., Troendle, M., Zhang, C., Langer, N.: ZuCo 2.0: A dataset of physiological recordings during natural reading and annotation. In: Proceedings of the Twelfth Language Resources and Evaluation Conference, pp. 138–146. European Language Resources Association, Marseille, France (2020). https://aclanthology.org/2020.lrec-1.18
Malmaud, J., Levy, R., Berzak, Y.: Bridging information-seeking human gaze and machine reading comprehension. In: Proceedings of the 24th Conference on Computational Natural Language Learning, pp. 142–152. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.conll-1.11
de Leeuw, J.R.: jsPsych: A JavaScript library for creating behavioral experiments in a Web browser. Behavior Research Methods 47(1), 1–12 (2015). https://doi.org/10.3758/s13428-014-0458-y
Papoutsaki, A., Sangkloy, P., Laskey, J., Daskalova, N., Huang, J., Hays, J.: WebGazer: Scalable webcam eye tracking using user interactions. In: Proceedings of the Twenty-Fifth International Joint Conference on Artificial Intelligence. IJCAI'16, pp. 3839–3845. AAAI Press (2016)
Rayner, K., Pollatsek, A., Ashby, J., Clifton Jr, C.: Psychology of Reading (2012)
Liversedge, S.P., Findlay, J.M.: Saccadic eye movements and cognition. Trends in Cognitive Sciences 4(1), 6–14 (2000)
Rayner, K.: Eye movements in reading and information processing: 20 years of research. Psychological Bulletin 124(3), 372 (1998)
Radach, R., Kennedy, A.: Theoretical perspectives on eye movements in reading: Past controversies, current issues, and an agenda for future research. European Journal of Cognitive Psychology 16(1–2), 3–26 (2004)
Winke, P.M.: Eye-tracking technology for reading. The Companion to Language Assessment 2, 1029–1046 (2013)
Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: ZuCo, a simultaneous EEG and eye-tracking resource for natural sentence reading. Scientific Data 5(1), 1–13 (2018)
Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior Research Methods 49, 602–615 (2017)
Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: CELER: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022)
Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior Research Methods, 1–21 (2022). Springer
Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: Activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016)
Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in English as a second language: Evidence from the Multilingual Eye-movements Corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023)
Hollenstein, N., Barrett, M., Björnsdóttir, M.: The Copenhagen Corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182
Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior Research Methods, 1–7 (2023)
Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: Empirical foundations for a minimal reporting guideline. Behavior Research Methods 55(1), 364–416 (2023)
Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019)
Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer (2023)
Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence, pp. 4907–4913 (2021)
Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018). https://doi.org/10.3758/s13428-017-0908-4
González-Garduño, A.V., Søgaard, A.: Using gaze to predict text readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050
Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016
Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence classification with human attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030
Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2
Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012)
Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: How speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010)
Ferhat, O., Vilariño, F.: Low cost eye tracking: The current panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016). https://doi.org/10.1155/2016/8680541
Papoutsaki, A.: Scalable webcam eye tracking by learning from user interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA '15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627
Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013). https://doi.org/10.1109/TPAMI.2012.101
Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014). https://doi.org/10.1109/TPAMI.2014.2313123
Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: TurkerGaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015)
Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam eye tracking for remote studies of web search. In: Proceedings of the 2017 Conference on Human Information Interaction and Retrieval. CHIIR '17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170
Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-based personalized summarization of Wikipedia reading session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN'20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662
Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature Communications 11(1), 4553 (2020)
Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022). https://doi.org/10.1016/j.bspc.2022.103521
Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera: Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022)
Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023)
Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421
Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264
Manning, C.D., Raghavan, P., Schütze, H.: Introduction to Information Retrieval. Cambridge University Press, Cambridge (2008)
Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016). https://doi.org/10.3758/s13428-015-0642-8
Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000)
Just, M.A., Carpenter, P.A.: A theory of reading: From eye fixations to comprehension. Psychological Review 87(4), 329 (1980)
DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408
Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021)
Chiang, C.-H., Lee, H.-Y.: Re-examining human annotations for interpretable NLP (2022)
Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023)
Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022)
[2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Hollenstein, N., Beinborn, L.: Relative importance in sentence processing. In: Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 2: Short Papers), pp. 141–150. Association for Computational Linguistics, Online (2021). https://doi.org/10.18653/v1/2021.acl-short.19 . 
https://aclanthology.org/2021.acl-short.19 McGuire and Tomuro [2021] McGuire, E., Tomuro, N.: Sentiment analysis with cognitive attention supervision. In: Canadian Conference on AI (2021) Dong et al. [2022] Dong, S., Goldstein, J., Yang, G.H.: GazBy: Gaze-Based BERT Model to Incorporate Human Attention in Neural Information Retrieval. In: Proceedings of the 2022 ACM SIGIR International Conference on Theory of Information Retrieval, pp. 182–192 (2022). https://doi.org/10.1145/3539813.3545129 Eberle et al. [2022] Eberle, O., Brandl, S., Pilot, J., Søgaard, A.: Do transformer models show similar attention patterns to task-specific human gaze? In: Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pp. 4295–4309. Association for Computational Linguistics, Dublin, Ireland (2022). https://doi.org/10.18653/v1/2022.acl-long.296 . https://aclanthology.org/2022.acl-long.296 Hollenstein et al. [2020] Hollenstein, N., Troendle, M., Zhang, C., Langer, N.: ZuCo 2.0: A dataset of physiological recordings during natural reading and annotation. In: Proceedings of the Twelfth Language Resources and Evaluation Conference, pp. 138–146. European Language Resources Association, Marseille, France (2020). https://aclanthology.org/2020.lrec-1.18 Malmaud et al. [2020] Malmaud, J., Levy, R., Berzak, Y.: Bridging information-seeking human gaze and machine reading comprehension. In: Proceedings of the 24th Conference on Computational Natural Language Learning, pp. 142–152. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.conll-1.11 . https://aclanthology.org/2020.conll-1.11 de Leeuw [2015] Leeuw, J.R.: jsPsych: A JavaScript library for creating behavioral experiments in a Web browser. Behavior Research Methods 47(1), 1–12 (2015) https://doi.org/10.3758/s13428-014-0458-y Papoutsaki et al. 
[2016] Papoutsaki, A., Sangkloy, P., Laskey, J., Daskalova, N., Huang, J., Hays, J.: Webgazer: Scalable webcam eye tracking using user interactions. In: Proceedings of the Twenty-Fifth International Joint Conference on Artificial Intelligence. IJCAI’16, pp. 3839–3845. AAAI Press, ??? (2016) Rayner et al. [2012] Rayner, K., Pollatsek, A., Ashby, J., Clifton Jr, C.: Psychology of reading (2012) Liversedge and Findlay [2000] Liversedge, S.P., Findlay, J.M.: Saccadic eye movements and cognition. Trends in cognitive sciences 4(1), 6–14 (2000) Rayner [1998] Rayner, K.: Eye movements in reading and information processing: 20 years of research. Psychological bulletin 124(3), 372 (1998) Radach and Kennedy [2004] Radach, R., Kennedy, A.: Theoretical perspectives on eye movements in reading: Past controversies, current issues, and an agenda for future research. European journal of cognitive psychology 16(1-2), 3–26 (2004) Winke [2013] Winke, P.M.: Eye-tracking technology for reading. The companion to language assessment 2, 1029–1046 (2013) Hollenstein et al. [2018] Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: Zuco, a simultaneous eeg and eye-tracking resource for natural sentence reading. Scientific data 5(1), 1–13 (2018) Cop et al. [2017] Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior research methods 49, 602–615 (2017) Berzak et al. [2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: Celer: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). 
Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. 
[2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. 
[2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. 
Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . 
https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) McGuire, E., Tomuro, N.: Sentiment analysis with cognitive attention supervision. In: Canadian Conference on AI (2021) Dong et al. [2022] Dong, S., Goldstein, J., Yang, G.H.: GazBy: Gaze-Based BERT Model to Incorporate Human Attention in Neural Information Retrieval. In: Proceedings of the 2022 ACM SIGIR International Conference on Theory of Information Retrieval, pp. 182–192 (2022). https://doi.org/10.1145/3539813.3545129 Eberle et al. [2022] Eberle, O., Brandl, S., Pilot, J., Søgaard, A.: Do transformer models show similar attention patterns to task-specific human gaze? In: Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pp. 4295–4309. Association for Computational Linguistics, Dublin, Ireland (2022). https://doi.org/10.18653/v1/2022.acl-long.296 . https://aclanthology.org/2022.acl-long.296 Hollenstein et al. [2020] Hollenstein, N., Troendle, M., Zhang, C., Langer, N.: ZuCo 2.0: A dataset of physiological recordings during natural reading and annotation. In: Proceedings of the Twelfth Language Resources and Evaluation Conference, pp. 138–146. European Language Resources Association, Marseille, France (2020). https://aclanthology.org/2020.lrec-1.18 Malmaud et al. [2020] Malmaud, J., Levy, R., Berzak, Y.: Bridging information-seeking human gaze and machine reading comprehension. In: Proceedings of the 24th Conference on Computational Natural Language Learning, pp. 142–152. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.conll-1.11 . https://aclanthology.org/2020.conll-1.11 de Leeuw [2015] Leeuw, J.R.: jsPsych: A JavaScript library for creating behavioral experiments in a Web browser. 
Behavior Research Methods 47(1), 1–12 (2015) https://doi.org/10.3758/s13428-014-0458-y Papoutsaki et al. [2016] Papoutsaki, A., Sangkloy, P., Laskey, J., Daskalova, N., Huang, J., Hays, J.: Webgazer: Scalable webcam eye tracking using user interactions. In: Proceedings of the Twenty-Fifth International Joint Conference on Artificial Intelligence. IJCAI’16, pp. 3839–3845. AAAI Press, ??? (2016) Rayner et al. [2012] Rayner, K., Pollatsek, A., Ashby, J., Clifton Jr, C.: Psychology of reading (2012) Liversedge and Findlay [2000] Liversedge, S.P., Findlay, J.M.: Saccadic eye movements and cognition. Trends in cognitive sciences 4(1), 6–14 (2000) Rayner [1998] Rayner, K.: Eye movements in reading and information processing: 20 years of research. Psychological bulletin 124(3), 372 (1998) Radach and Kennedy [2004] Radach, R., Kennedy, A.: Theoretical perspectives on eye movements in reading: Past controversies, current issues, and an agenda for future research. European journal of cognitive psychology 16(1-2), 3–26 (2004) Winke [2013] Winke, P.M.: Eye-tracking technology for reading. The companion to language assessment 2, 1029–1046 (2013) Hollenstein et al. [2018] Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: Zuco, a simultaneous eeg and eye-tracking resource for natural sentence reading. Scientific data 5(1), 1–13 (2018) Cop et al. [2017] Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior research methods 49, 602–615 (2017) Berzak et al. [2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: Celer: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. 
[2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior Research Methods, 1–21 (2022). Publisher: Springer
Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016)
Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in English as a second language: Evidence from the Multilingual Eye-Movements Corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023)
Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The Copenhagen Corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182
Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior Research Methods, 1–7 (2023)
Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior Research Methods 55(1), 364–416 (2023)
Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals.
arXiv preprint arXiv:1904.02682 (2019)
Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer (2023)
Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence, pp. 4907–4913 (2021)
Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018). https://doi.org/10.3758/s13428-017-0908-4
González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using gaze to predict text readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050. https://aclanthology.org/W17-5050
Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016. https://aclanthology.org/K16-1016
Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence classification with human attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030. https://aclanthology.org/K18-1030
Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance.
In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2
Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012)
Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010)
Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low cost eye tracking: The current panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016). https://doi.org/10.1155/2016/8680541. Publisher: Hindawi Publishing Corporation
Papoutsaki [2015] Papoutsaki, A.: Scalable webcam eye tracking by learning from user interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627
Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013). https://doi.org/10.1109/TPAMI.2012.101
Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014). https://doi.org/10.1109/TPAMI.2014.2313123
Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: TurkerGaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015)
Papoutsaki et al.
[2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam eye tracking for remote studies of web search. In: Proceedings of the 2017 Conference on Human Information Interaction and Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170
Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-based personalized summarization of Wikipedia reading session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662
Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature Communications 11(1), 4553 (2020)
Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022). https://doi.org/10.1016/j.bspc.2022.103521
Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web camera: Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022)
Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023)
Artetxe et al.
[2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421. https://aclanthology.org/2020.acl-main.421
Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264. https://aclanthology.org/D16-1264
Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval, vol. 39. Cambridge University Press, Cambridge (2008)
Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016). https://doi.org/10.3758/s13428-015-0642-8
Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000)
Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological Review 87(4), 329 (1980)
DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408.
https://aclanthology.org/2020.acl-main.408
Wiegreffe and Marasović [2021] Wiegreffe, S., Marasović, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021)
Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-examining human annotations for interpretable NLP (2022)
Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023)
Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022)
Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE
Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423. https://aclanthology.org/N19-1423
Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022)
Ikhwantri et al.
[2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023)
https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. 
[2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
Hollenstein et al. [2020] Hollenstein, N., Troendle, M., Zhang, C., Langer, N.: ZuCo 2.0: A dataset of physiological recordings during natural reading and annotation. In: Proceedings of the Twelfth Language Resources and Evaluation Conference, pp. 138–146. European Language Resources Association, Marseille, France (2020). https://aclanthology.org/2020.lrec-1.18
Malmaud et al. [2020] Malmaud, J., Levy, R., Berzak, Y.: Bridging information-seeking human gaze and machine reading comprehension. In: Proceedings of the 24th Conference on Computational Natural Language Learning, pp. 142–152. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.conll-1.11 https://aclanthology.org/2020.conll-1.11
de Leeuw [2015] de Leeuw, J.R.: jsPsych: A JavaScript library for creating behavioral experiments in a Web browser. Behavior Research Methods 47(1), 1–12 (2015). https://doi.org/10.3758/s13428-014-0458-y
Papoutsaki et al. [2016] Papoutsaki, A., Sangkloy, P., Laskey, J., Daskalova, N., Huang, J., Hays, J.: WebGazer: Scalable webcam eye tracking using user interactions. In: Proceedings of the Twenty-Fifth International Joint Conference on Artificial Intelligence. IJCAI'16, pp. 3839–3845. AAAI Press (2016)
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Leeuw, J.R.: jsPsych: A JavaScript library for creating behavioral experiments in a Web browser. Behavior Research Methods 47(1), 1–12 (2015) https://doi.org/10.3758/s13428-014-0458-y Papoutsaki et al. [2016] Papoutsaki, A., Sangkloy, P., Laskey, J., Daskalova, N., Huang, J., Hays, J.: Webgazer: Scalable webcam eye tracking using user interactions. In: Proceedings of the Twenty-Fifth International Joint Conference on Artificial Intelligence. IJCAI’16, pp. 3839–3845. AAAI Press, ??? (2016) Rayner et al. [2012] Rayner, K., Pollatsek, A., Ashby, J., Clifton Jr, C.: Psychology of reading (2012) Liversedge and Findlay [2000] Liversedge, S.P., Findlay, J.M.: Saccadic eye movements and cognition. Trends in cognitive sciences 4(1), 6–14 (2000) Rayner [1998] Rayner, K.: Eye movements in reading and information processing: 20 years of research. Psychological bulletin 124(3), 372 (1998) Radach and Kennedy [2004] Radach, R., Kennedy, A.: Theoretical perspectives on eye movements in reading: Past controversies, current issues, and an agenda for future research. European journal of cognitive psychology 16(1-2), 3–26 (2004) Winke [2013] Winke, P.M.: Eye-tracking technology for reading. The companion to language assessment 2, 1029–1046 (2013) Hollenstein et al. [2018] Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: Zuco, a simultaneous eeg and eye-tracking resource for natural sentence reading. Scientific data 5(1), 1–13 (2018) Cop et al. [2017] Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior research methods 49, 602–615 (2017) Berzak et al. 
[2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: Celer: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. 
Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . 
arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. 
In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. 
[2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. 
[2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Radach, R., Kennedy, A.: Theoretical perspectives on eye movements in reading: Past controversies, current issues, and an agenda for future research. European journal of cognitive psychology 16(1-2), 3–26 (2004) Winke [2013] Winke, P.M.: Eye-tracking technology for reading. The companion to language assessment 2, 1029–1046 (2013) Hollenstein et al. [2018] Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: Zuco, a simultaneous eeg and eye-tracking resource for natural sentence reading. Scientific data 5(1), 1–13 (2018) Cop et al. [2017] Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior research methods 49, 602–615 (2017) Berzak et al. [2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: Celer: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. 
4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. 
[2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. 
Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-based personalized summarization of Wikipedia reading session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext (HUMAN '20). Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662
Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature Communications 11(1), 4553 (2020)
Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022). https://doi.org/10.1016/j.bspc.2022.103521
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. 
Behavior Research Methods 55(1), 364–416 (2023)
Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019)
Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer (2023)
Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence, pp. 4907–4913 (2021)
[2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . 
https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. 
[2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. 
Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. 
[2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. 
In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 
1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. 
[2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . 
https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. 
[2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. 
[2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. 
[2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. 
In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. 
[2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . 
https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. 
[2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023)
Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023)
Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 https://aclanthology.org/2020.acl-main.421
Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 https://aclanthology.org/D16-1264
Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval. Cambridge University Press, Cambridge (2008)
Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016). https://doi.org/10.3758/s13428-015-0642-8
Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000)
Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: From eye fixations to comprehension. Psychological Review 87(4), 329 (1980)
DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 https://aclanthology.org/2020.acl-main.408
Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021)
Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-examining human annotations for interpretable NLP (2022)
Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023)
Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022)
Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155. IEEE (2021)
Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 https://aclanthology.org/N19-1423
Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022)
  5. Brandl, S., Hollenstein, N.: Every word counts: A multilingual analysis of individual human alignment with model attention. In: Proceedings of the 2nd Conference of the Asia-Pacific Chapter of the Association for Computational Linguistics and the 12th International Joint Conference on Natural Language Processing (Volume 2: Short Papers), pp. 72–77. Association for Computational Linguistics, Online only (2022). https://aclanthology.org/2022.aacl-short.10
  6. Søgaard, A.: Explainable Natural Language Processing. Morgan & Claypool Publishers (2021)
  7. Xu, S., Jiang, H., Lau, F.C.M.: User-oriented document summarization through vision-based eye-tracking. In: Proceedings of the 14th International Conference on Intelligent User Interfaces. IUI ’09, pp. 7–16. Association for Computing Machinery, New York, NY, USA (2009). https://doi.org/10.1145/1502650.1502656
  8. Barrett, M., Bingel, J., Keller, F., Søgaard, A.: Weakly supervised part-of-speech tagging using eye-tracking data. In: Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers), pp. 579–584. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/P16-2094
  9. Hollenstein, N., Zhang, C.: Entity recognition at first sight: Improving NER with eye movement information. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 1–10. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1001
  10. Hollenstein, N., Beinborn, L.: Relative importance in sentence processing. In: Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 2: Short Papers), pp. 141–150. Association for Computational Linguistics, Online (2021). https://doi.org/10.18653/v1/2021.acl-short.19
  11. McGuire, E., Tomuro, N.: Sentiment analysis with cognitive attention supervision. In: Canadian Conference on AI (2021)
  12. Dong, S., Goldstein, J., Yang, G.H.: GazBy: Gaze-based BERT model to incorporate human attention in neural information retrieval. In: Proceedings of the 2022 ACM SIGIR International Conference on Theory of Information Retrieval, pp. 182–192 (2022). https://doi.org/10.1145/3539813.3545129
  13. Eberle, O., Brandl, S., Pilot, J., Søgaard, A.: Do transformer models show similar attention patterns to task-specific human gaze? In: Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pp. 4295–4309. Association for Computational Linguistics, Dublin, Ireland (2022). https://doi.org/10.18653/v1/2022.acl-long.296
  14. Hollenstein, N., Troendle, M., Zhang, C., Langer, N.: ZuCo 2.0: A dataset of physiological recordings during natural reading and annotation. In: Proceedings of the Twelfth Language Resources and Evaluation Conference, pp. 138–146. European Language Resources Association, Marseille, France (2020). https://aclanthology.org/2020.lrec-1.18
  15. Malmaud, J., Levy, R., Berzak, Y.: Bridging information-seeking human gaze and machine reading comprehension. In: Proceedings of the 24th Conference on Computational Natural Language Learning, pp. 142–152. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.conll-1.11
  16. de Leeuw, J.R.: jsPsych: A JavaScript library for creating behavioral experiments in a Web browser. Behavior Research Methods 47(1), 1–12 (2015). https://doi.org/10.3758/s13428-014-0458-y
  17. Papoutsaki, A., Sangkloy, P., Laskey, J., Daskalova, N., Huang, J., Hays, J.: WebGazer: Scalable webcam eye tracking using user interactions. In: Proceedings of the Twenty-Fifth International Joint Conference on Artificial Intelligence. IJCAI’16, pp. 3839–3845. AAAI Press (2016)
  18. Rayner, K., Pollatsek, A., Ashby, J., Clifton Jr, C.: Psychology of Reading (2012)
  19. Liversedge, S.P., Findlay, J.M.: Saccadic eye movements and cognition. Trends in Cognitive Sciences 4(1), 6–14 (2000)
  20. Rayner, K.: Eye movements in reading and information processing: 20 years of research. Psychological Bulletin 124(3), 372 (1998)
  21. Radach, R., Kennedy, A.: Theoretical perspectives on eye movements in reading: Past controversies, current issues, and an agenda for future research. European Journal of Cognitive Psychology 16(1–2), 3–26 (2004)
  22. Winke, P.M.: Eye-tracking technology for reading. The Companion to Language Assessment 2, 1029–1046 (2013)
  23. Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: ZuCo, a simultaneous EEG and eye-tracking resource for natural sentence reading. Scientific Data 5(1), 1–13 (2018)
  24. Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior Research Methods 49, 602–615 (2017)
  25. Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: CELER: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022)
  26. Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior Research Methods, 1–21 (2022)
  27. Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: Activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016)
  28. Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in English as a second language: Evidence from the Multilingual Eye-movements Corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023)
  29. Hollenstein, N., Barrett, M., Björnsdóttir, M.: The Copenhagen Corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182
  30. Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior Research Methods, 1–7 (2023)
  31. Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: Empirical foundations for a minimal reporting guideline. Behavior Research Methods 55(1), 364–416 (2023)
  32. Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019)
  33. Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer (2023)
  34. Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence, pp. 4907–4913 (2021)
  35. Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018). https://doi.org/10.3758/s13428-017-0908-4
  36. González-Garduño, A.V., Søgaard, A.: Using gaze to predict text readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050
  37. Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016
  38. Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence classification with human attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030
  39. Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2
  40. Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012)
  41. Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: How speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010)
  42. Ferhat, O., Vilariño, F.: Low cost eye tracking: The current panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016). https://doi.org/10.1155/2016/8680541
  43. Papoutsaki, A.: Scalable webcam eye tracking by learning from user interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627
  44. Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013). https://doi.org/10.1109/TPAMI.2012.101
  45. Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014). https://doi.org/10.1109/TPAMI.2014.2313123
  46. Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: TurkerGaze: Crowdsourcing saliency with webcam-based eye tracking. arXiv preprint arXiv:1504.06755 (2015)
  47. Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam eye tracking for remote studies of web search. In: Proceedings of the 2017 Conference on Human Information Interaction and Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170
  48. Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-based personalized summarization of Wikipedia reading session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN ’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662
  49. Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature Communications 11(1), 4553 (2020)
  50. Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022). https://doi.org/10.1016/j.bspc.2022.103521
  51. Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera: Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022)
  52. Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023)
  53. Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421
  54. Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264
  55. Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval, vol. 39. Cambridge University Press, Cambridge (2008)
  56. Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016). https://doi.org/10.3758/s13428-015-0642-8
  57. Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000)
  58. Just, M.A., Carpenter, P.A.: A theory of reading: From eye fixations to comprehension. Psychological Review 87(4), 329 (1980)
  59. DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408
  60. Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021)
  61. Chiang, C.-H., Lee, H.-Y.: Re-examining human annotations for interpretable NLP (2022)
  62. Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023)
  63. Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022)
  64. Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155. IEEE (2021)
  65. Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423
  66. Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022)
  67. Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023)
  68. Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. 
[2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Xu, S., Jiang, H., Lau, F.C.M.: User-Oriented Document Summarization through Vision-Based Eye-Tracking. In: Proceedings of the 14th International Conference on Intelligent User Interfaces. IUI ’09, pp. 7–16. Association for Computing Machinery, New York, NY, USA (2009). https://doi.org/10.1145/1502650.1502656 . https://doi.org/10.1145/1502650.1502656 Barrett et al. [2016] Barrett, M., Bingel, J., Keller, F., Søgaard, A.: Weakly Supervised Part-of-speech Tagging Using Eye-tracking Data. In: Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers), pp. 579–584. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/P16-2094 . https://aclanthology.org/P16-2094 Hollenstein and Zhang [2019] Hollenstein, N., Zhang, C.: Entity recognition at first sight: Improving NER with eye movement information. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 1–10. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1001 . https://aclanthology.org/N19-1001 Hollenstein and Beinborn [2021] Hollenstein, N., Beinborn, L.: Relative importance in sentence processing. 
In: Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 2: Short Papers), pp. 141–150. Association for Computational Linguistics, Online (2021). https://doi.org/10.18653/v1/2021.acl-short.19 . https://aclanthology.org/2021.acl-short.19 McGuire and Tomuro [2021] McGuire, E., Tomuro, N.: Sentiment analysis with cognitive attention supervision. In: Canadian Conference on AI (2021) Dong et al. [2022] Dong, S., Goldstein, J., Yang, G.H.: GazBy: Gaze-Based BERT Model to Incorporate Human Attention in Neural Information Retrieval. In: Proceedings of the 2022 ACM SIGIR International Conference on Theory of Information Retrieval, pp. 182–192 (2022). https://doi.org/10.1145/3539813.3545129 Eberle et al. [2022] Eberle, O., Brandl, S., Pilot, J., Søgaard, A.: Do transformer models show similar attention patterns to task-specific human gaze? In: Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pp. 4295–4309. Association for Computational Linguistics, Dublin, Ireland (2022). https://doi.org/10.18653/v1/2022.acl-long.296 . https://aclanthology.org/2022.acl-long.296 Hollenstein et al. [2020] Hollenstein, N., Troendle, M., Zhang, C., Langer, N.: ZuCo 2.0: A dataset of physiological recordings during natural reading and annotation. In: Proceedings of the Twelfth Language Resources and Evaluation Conference, pp. 138–146. European Language Resources Association, Marseille, France (2020). https://aclanthology.org/2020.lrec-1.18 Malmaud et al. [2020] Malmaud, J., Levy, R., Berzak, Y.: Bridging information-seeking human gaze and machine reading comprehension. In: Proceedings of the 24th Conference on Computational Natural Language Learning, pp. 142–152. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.conll-1.11 . 
https://aclanthology.org/2020.conll-1.11 de Leeuw [2015] Leeuw, J.R.: jsPsych: A JavaScript library for creating behavioral experiments in a Web browser. Behavior Research Methods 47(1), 1–12 (2015) https://doi.org/10.3758/s13428-014-0458-y Papoutsaki et al. [2016] Papoutsaki, A., Sangkloy, P., Laskey, J., Daskalova, N., Huang, J., Hays, J.: Webgazer: Scalable webcam eye tracking using user interactions. In: Proceedings of the Twenty-Fifth International Joint Conference on Artificial Intelligence. IJCAI’16, pp. 3839–3845. AAAI Press, ??? (2016) Rayner et al. [2012] Rayner, K., Pollatsek, A., Ashby, J., Clifton Jr, C.: Psychology of reading (2012) Liversedge and Findlay [2000] Liversedge, S.P., Findlay, J.M.: Saccadic eye movements and cognition. Trends in cognitive sciences 4(1), 6–14 (2000) Rayner [1998] Rayner, K.: Eye movements in reading and information processing: 20 years of research. Psychological bulletin 124(3), 372 (1998) Radach and Kennedy [2004] Radach, R., Kennedy, A.: Theoretical perspectives on eye movements in reading: Past controversies, current issues, and an agenda for future research. European journal of cognitive psychology 16(1-2), 3–26 (2004) Winke [2013] Winke, P.M.: Eye-tracking technology for reading. The companion to language assessment 2, 1029–1046 (2013) Hollenstein et al. [2018] Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: Zuco, a simultaneous eeg and eye-tracking resource for natural sentence reading. Scientific data 5(1), 1–13 (2018) Cop et al. [2017] Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior research methods 49, 602–615 (2017) Berzak et al. [2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: Celer: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. 
[2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. 
arXiv preprint arXiv:1904.02682 (2019)
[2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Hollenstein, N., Zhang, C.: Entity recognition at first sight: Improving NER with eye movement information. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 1–10. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1001 . https://aclanthology.org/N19-1001 Hollenstein and Beinborn [2021] Hollenstein, N., Beinborn, L.: Relative importance in sentence processing. In: Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 2: Short Papers), pp. 141–150. Association for Computational Linguistics, Online (2021). https://doi.org/10.18653/v1/2021.acl-short.19 . https://aclanthology.org/2021.acl-short.19 McGuire and Tomuro [2021] McGuire, E., Tomuro, N.: Sentiment analysis with cognitive attention supervision. In: Canadian Conference on AI (2021) Dong et al. [2022] Dong, S., Goldstein, J., Yang, G.H.: GazBy: Gaze-Based BERT Model to Incorporate Human Attention in Neural Information Retrieval. In: Proceedings of the 2022 ACM SIGIR International Conference on Theory of Information Retrieval, pp. 182–192 (2022). https://doi.org/10.1145/3539813.3545129 Eberle et al. [2022] Eberle, O., Brandl, S., Pilot, J., Søgaard, A.: Do transformer models show similar attention patterns to task-specific human gaze? In: Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pp. 4295–4309. Association for Computational Linguistics, Dublin, Ireland (2022). https://doi.org/10.18653/v1/2022.acl-long.296 . 
https://aclanthology.org/2022.acl-long.296 Hollenstein et al. [2020] Hollenstein, N., Troendle, M., Zhang, C., Langer, N.: ZuCo 2.0: A dataset of physiological recordings during natural reading and annotation. In: Proceedings of the Twelfth Language Resources and Evaluation Conference, pp. 138–146. European Language Resources Association, Marseille, France (2020). https://aclanthology.org/2020.lrec-1.18 Malmaud et al. [2020] Malmaud, J., Levy, R., Berzak, Y.: Bridging information-seeking human gaze and machine reading comprehension. In: Proceedings of the 24th Conference on Computational Natural Language Learning, pp. 142–152. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.conll-1.11 . https://aclanthology.org/2020.conll-1.11 de Leeuw [2015] Leeuw, J.R.: jsPsych: A JavaScript library for creating behavioral experiments in a Web browser. Behavior Research Methods 47(1), 1–12 (2015) https://doi.org/10.3758/s13428-014-0458-y Papoutsaki et al. [2016] Papoutsaki, A., Sangkloy, P., Laskey, J., Daskalova, N., Huang, J., Hays, J.: Webgazer: Scalable webcam eye tracking using user interactions. In: Proceedings of the Twenty-Fifth International Joint Conference on Artificial Intelligence. IJCAI’16, pp. 3839–3845. AAAI Press, ??? (2016) Rayner et al. [2012] Rayner, K., Pollatsek, A., Ashby, J., Clifton Jr, C.: Psychology of reading (2012) Liversedge and Findlay [2000] Liversedge, S.P., Findlay, J.M.: Saccadic eye movements and cognition. Trends in cognitive sciences 4(1), 6–14 (2000) Rayner [1998] Rayner, K.: Eye movements in reading and information processing: 20 years of research. Psychological bulletin 124(3), 372 (1998) Radach and Kennedy [2004] Radach, R., Kennedy, A.: Theoretical perspectives on eye movements in reading: Past controversies, current issues, and an agenda for future research. European journal of cognitive psychology 16(1-2), 3–26 (2004) Winke [2013] Winke, P.M.: Eye-tracking technology for reading. 
The companion to language assessment 2, 1029–1046 (2013) Hollenstein et al. [2018] Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: Zuco, a simultaneous eeg and eye-tracking resource for natural sentence reading. Scientific data 5(1), 1–13 (2018) Cop et al. [2017] Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior research methods 49, 602–615 (2017) Berzak et al. [2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: Celer: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. 
[2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. 
In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . 
https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. 
Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. 
Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Hollenstein, N., Beinborn, L.: Relative importance in sentence processing. In: Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 2: Short Papers), pp. 141–150. Association for Computational Linguistics, Online (2021). https://doi.org/10.18653/v1/2021.acl-short.19 . https://aclanthology.org/2021.acl-short.19 McGuire and Tomuro [2021] McGuire, E., Tomuro, N.: Sentiment analysis with cognitive attention supervision. In: Canadian Conference on AI (2021) Dong et al. [2022] Dong, S., Goldstein, J., Yang, G.H.: GazBy: Gaze-Based BERT Model to Incorporate Human Attention in Neural Information Retrieval. 
In: Proceedings of the 2022 ACM SIGIR International Conference on Theory of Information Retrieval, pp. 182–192 (2022). https://doi.org/10.1145/3539813.3545129 Eberle et al. [2022] Eberle, O., Brandl, S., Pilot, J., Søgaard, A.: Do transformer models show similar attention patterns to task-specific human gaze? In: Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pp. 4295–4309. Association for Computational Linguistics, Dublin, Ireland (2022). https://doi.org/10.18653/v1/2022.acl-long.296 . https://aclanthology.org/2022.acl-long.296 Hollenstein et al. [2020] Hollenstein, N., Troendle, M., Zhang, C., Langer, N.: ZuCo 2.0: A dataset of physiological recordings during natural reading and annotation. In: Proceedings of the Twelfth Language Resources and Evaluation Conference, pp. 138–146. European Language Resources Association, Marseille, France (2020). https://aclanthology.org/2020.lrec-1.18 Malmaud et al. [2020] Malmaud, J., Levy, R., Berzak, Y.: Bridging information-seeking human gaze and machine reading comprehension. In: Proceedings of the 24th Conference on Computational Natural Language Learning, pp. 142–152. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.conll-1.11 . https://aclanthology.org/2020.conll-1.11 de Leeuw [2015] Leeuw, J.R.: jsPsych: A JavaScript library for creating behavioral experiments in a Web browser. Behavior Research Methods 47(1), 1–12 (2015) https://doi.org/10.3758/s13428-014-0458-y Papoutsaki et al. [2016] Papoutsaki, A., Sangkloy, P., Laskey, J., Daskalova, N., Huang, J., Hays, J.: Webgazer: Scalable webcam eye tracking using user interactions. In: Proceedings of the Twenty-Fifth International Joint Conference on Artificial Intelligence. IJCAI’16, pp. 3839–3845. AAAI Press, ??? (2016) Rayner et al. 
[2012] Rayner, K., Pollatsek, A., Ashby, J., Clifton Jr, C.: Psychology of reading (2012) Liversedge and Findlay [2000] Liversedge, S.P., Findlay, J.M.: Saccadic eye movements and cognition. Trends in cognitive sciences 4(1), 6–14 (2000) Rayner [1998] Rayner, K.: Eye movements in reading and information processing: 20 years of research. Psychological bulletin 124(3), 372 (1998) Radach and Kennedy [2004] Radach, R., Kennedy, A.: Theoretical perspectives on eye movements in reading: Past controversies, current issues, and an agenda for future research. European journal of cognitive psychology 16(1-2), 3–26 (2004) Winke [2013] Winke, P.M.: Eye-tracking technology for reading. The companion to language assessment 2, 1029–1046 (2013) Hollenstein et al. [2018] Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: Zuco, a simultaneous eeg and eye-tracking resource for natural sentence reading. Scientific data 5(1), 1–13 (2018) Cop et al. [2017] Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior research methods 49, 602–615 (2017) Berzak et al. [2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: Celer: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. 
[2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 
4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. 
[2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: How speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low cost eye tracking: The current panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable webcam eye tracking by learning from user interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: TurkerGaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al.
[2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam eye tracking for remote studies of web search. In: Proceedings of the 2017 Conference on Human Information Interaction and Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-based personalized summarization of Wikipedia reading session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature Communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera: Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al.
[2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval. Cambridge University Press, Cambridge (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: From eye fixations to comprehension. Psychological Review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing.
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) McGuire and Tomuro [2021] McGuire, E., Tomuro, N.: Sentiment analysis with cognitive attention supervision. In: Canadian Conference on AI (2021) Dong et al. [2022] Dong, S., Goldstein, J., Yang, G.H.: GazBy: Gaze-based BERT model to incorporate human attention in neural information retrieval. In: Proceedings of the 2022 ACM SIGIR International Conference on Theory of Information Retrieval, pp. 182–192 (2022). https://doi.org/10.1145/3539813.3545129 Eberle et al. [2022] Eberle, O., Brandl, S., Pilot, J., Søgaard, A.: Do transformer models show similar attention patterns to task-specific human gaze? In: Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pp. 4295–4309. Association for Computational Linguistics, Dublin, Ireland (2022). https://doi.org/10.18653/v1/2022.acl-long.296 . https://aclanthology.org/2022.acl-long.296 Hollenstein et al. [2020] Hollenstein, N., Troendle, M., Zhang, C., Langer, N.: ZuCo 2.0: A dataset of physiological recordings during natural reading and annotation. In: Proceedings of the Twelfth Language Resources and Evaluation Conference, pp. 138–146. European Language Resources Association, Marseille, France (2020). https://aclanthology.org/2020.lrec-1.18 Malmaud et al. [2020] Malmaud, J., Levy, R., Berzak, Y.: Bridging information-seeking human gaze and machine reading comprehension. In: Proceedings of the 24th Conference on Computational Natural Language Learning, pp. 142–152. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.conll-1.11 . https://aclanthology.org/2020.conll-1.11 de Leeuw [2015] de Leeuw, J.R.: jsPsych: A JavaScript library for creating behavioral experiments in a Web browser.
Behavior Research Methods 47(1), 1–12 (2015) https://doi.org/10.3758/s13428-014-0458-y Papoutsaki et al. [2016] Papoutsaki, A., Sangkloy, P., Laskey, J., Daskalova, N., Huang, J., Hays, J.: WebGazer: Scalable webcam eye tracking using user interactions. In: Proceedings of the Twenty-Fifth International Joint Conference on Artificial Intelligence. IJCAI’16, pp. 3839–3845. AAAI Press (2016) Rayner et al. [2012] Rayner, K., Pollatsek, A., Ashby, J., Clifton Jr, C.: Psychology of Reading (2012) Liversedge and Findlay [2000] Liversedge, S.P., Findlay, J.M.: Saccadic eye movements and cognition. Trends in Cognitive Sciences 4(1), 6–14 (2000) Rayner [1998] Rayner, K.: Eye movements in reading and information processing: 20 years of research. Psychological Bulletin 124(3), 372 (1998) Radach and Kennedy [2004] Radach, R., Kennedy, A.: Theoretical perspectives on eye movements in reading: Past controversies, current issues, and an agenda for future research. European Journal of Cognitive Psychology 16(1-2), 3–26 (2004) Winke [2013] Winke, P.M.: Eye-tracking technology for reading. The Companion to Language Assessment 2, 1029–1046 (2013) Hollenstein et al. [2018] Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: ZuCo, a simultaneous EEG and eye-tracking resource for natural sentence reading. Scientific Data 5(1), 1–13 (2018) Cop et al. [2017] Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior Research Methods 49, 602–615 (2017) Berzak et al. [2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: CELER: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al.
[2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior Research Methods, 1–21 (2022). Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: Activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016)
https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. 
[2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Eberle, O., Brandl, S., Pilot, J., Søgaard, A.: Do transformer models show similar attention patterns to task-specific human gaze? In: Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pp. 4295–4309. Association for Computational Linguistics, Dublin, Ireland (2022). https://doi.org/10.18653/v1/2022.acl-long.296 . https://aclanthology.org/2022.acl-long.296 Hollenstein et al. [2020] Hollenstein, N., Troendle, M., Zhang, C., Langer, N.: ZuCo 2.0: A dataset of physiological recordings during natural reading and annotation. In: Proceedings of the Twelfth Language Resources and Evaluation Conference, pp. 138–146. European Language Resources Association, Marseille, France (2020). https://aclanthology.org/2020.lrec-1.18 Malmaud et al. [2020] Malmaud, J., Levy, R., Berzak, Y.: Bridging information-seeking human gaze and machine reading comprehension. In: Proceedings of the 24th Conference on Computational Natural Language Learning, pp. 142–152. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.conll-1.11 . https://aclanthology.org/2020.conll-1.11 de Leeuw [2015] Leeuw, J.R.: jsPsych: A JavaScript library for creating behavioral experiments in a Web browser. Behavior Research Methods 47(1), 1–12 (2015) https://doi.org/10.3758/s13428-014-0458-y Papoutsaki et al. [2016] Papoutsaki, A., Sangkloy, P., Laskey, J., Daskalova, N., Huang, J., Hays, J.: Webgazer: Scalable webcam eye tracking using user interactions. 
In: Proceedings of the Twenty-Fifth International Joint Conference on Artificial Intelligence. IJCAI’16, pp. 3839–3845. AAAI Press, ??? (2016) Rayner et al. [2012] Rayner, K., Pollatsek, A., Ashby, J., Clifton Jr, C.: Psychology of reading (2012) Liversedge and Findlay [2000] Liversedge, S.P., Findlay, J.M.: Saccadic eye movements and cognition. Trends in cognitive sciences 4(1), 6–14 (2000) Rayner [1998] Rayner, K.: Eye movements in reading and information processing: 20 years of research. Psychological bulletin 124(3), 372 (1998) Radach and Kennedy [2004] Radach, R., Kennedy, A.: Theoretical perspectives on eye movements in reading: Past controversies, current issues, and an agenda for future research. European journal of cognitive psychology 16(1-2), 3–26 (2004) Winke [2013] Winke, P.M.: Eye-tracking technology for reading. The companion to language assessment 2, 1029–1046 (2013) Hollenstein et al. [2018] Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: Zuco, a simultaneous eeg and eye-tracking resource for natural sentence reading. Scientific data 5(1), 1–13 (2018) Cop et al. [2017] Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior research methods 49, 602–615 (2017) Berzak et al. [2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: Celer: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). Publisher: Springer Desai et al. 
[2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. 
In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. 
In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . 
https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. 
[2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Hollenstein, N., Troendle, M., Zhang, C., Langer, N.: ZuCo 2.0: A dataset of physiological recordings during natural reading and annotation. In: Proceedings of the Twelfth Language Resources and Evaluation Conference, pp. 138–146. European Language Resources Association, Marseille, France (2020). https://aclanthology.org/2020.lrec-1.18 Malmaud et al. [2020] Malmaud, J., Levy, R., Berzak, Y.: Bridging information-seeking human gaze and machine reading comprehension. In: Proceedings of the 24th Conference on Computational Natural Language Learning, pp. 142–152. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.conll-1.11 . https://aclanthology.org/2020.conll-1.11 de Leeuw [2015] Leeuw, J.R.: jsPsych: A JavaScript library for creating behavioral experiments in a Web browser. Behavior Research Methods 47(1), 1–12 (2015) https://doi.org/10.3758/s13428-014-0458-y Papoutsaki et al. [2016] Papoutsaki, A., Sangkloy, P., Laskey, J., Daskalova, N., Huang, J., Hays, J.: Webgazer: Scalable webcam eye tracking using user interactions. In: Proceedings of the Twenty-Fifth International Joint Conference on Artificial Intelligence. IJCAI’16, pp. 3839–3845. AAAI Press, ??? (2016) Rayner et al. [2012] Rayner, K., Pollatsek, A., Ashby, J., Clifton Jr, C.: Psychology of reading (2012) Liversedge and Findlay [2000] Liversedge, S.P., Findlay, J.M.: Saccadic eye movements and cognition. Trends in cognitive sciences 4(1), 6–14 (2000) Rayner [1998] Rayner, K.: Eye movements in reading and information processing: 20 years of research. Psychological bulletin 124(3), 372 (1998) Radach and Kennedy [2004] Radach, R., Kennedy, A.: Theoretical perspectives on eye movements in reading: Past controversies, current issues, and an agenda for future research. 
European journal of cognitive psychology 16(1-2), 3–26 (2004) Winke [2013] Winke, P.M.: Eye-tracking technology for reading. The companion to language assessment 2, 1029–1046 (2013) Hollenstein et al. [2018] Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: Zuco, a simultaneous eeg and eye-tracking resource for natural sentence reading. Scientific data 5(1), 1–13 (2018) Cop et al. [2017] Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior research methods 49, 602–615 (2017) Berzak et al. [2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: Celer: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). 
https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. 
[2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. 
Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. 
[2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. 
Behavior Research Methods 48(3), 829–842 (2016). https://doi.org/10.3758/s13428-015-0642-8
Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000)
Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: From eye fixations to comprehension. Psychological Review 87(4), 329 (1980)
DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408
Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021)
Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022)
Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023)
Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022)
Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155. IEEE (2021)
Devlin et al.
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423
Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022)
Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023)
Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
Malmaud et al. [2020] Malmaud, J., Levy, R., Berzak, Y.: Bridging information-seeking human gaze and machine reading comprehension. In: Proceedings of the 24th Conference on Computational Natural Language Learning, pp. 142–152. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.conll-1.11
de Leeuw [2015] de Leeuw, J.R.: jsPsych: A JavaScript library for creating behavioral experiments in a Web browser. Behavior Research Methods 47(1), 1–12 (2015). https://doi.org/10.3758/s13428-014-0458-y
Papoutsaki et al. [2016] Papoutsaki, A., Sangkloy, P., Laskey, J., Daskalova, N., Huang, J., Hays, J.: WebGazer: Scalable webcam eye tracking using user interactions.
In: Proceedings of the Twenty-Fifth International Joint Conference on Artificial Intelligence. IJCAI ’16, pp. 3839–3845. AAAI Press (2016)
Rayner et al. [2012] Rayner, K., Pollatsek, A., Ashby, J., Clifton Jr., C.: Psychology of Reading (2012)
Liversedge and Findlay [2000] Liversedge, S.P., Findlay, J.M.: Saccadic eye movements and cognition. Trends in Cognitive Sciences 4(1), 6–14 (2000)
Rayner [1998] Rayner, K.: Eye movements in reading and information processing: 20 years of research. Psychological Bulletin 124(3), 372 (1998)
Radach and Kennedy [2004] Radach, R., Kennedy, A.: Theoretical perspectives on eye movements in reading: Past controversies, current issues, and an agenda for future research. European Journal of Cognitive Psychology 16(1–2), 3–26 (2004)
Winke [2013] Winke, P.M.: Eye-tracking technology for reading. The Companion to Language Assessment 2, 1029–1046 (2013)
Hollenstein et al. [2018] Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: ZuCo, a simultaneous EEG and eye-tracking resource for natural sentence reading. Scientific Data 5(1), 1–13 (2018)
Cop et al. [2017] Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior Research Methods 49, 602–615 (2017)
Berzak et al. [2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: CELER: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022)
Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior Research Methods, 1–21 (2022)
Desai et al.
[2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: Activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016)
Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in English as a second language: Evidence from the Multilingual Eye-Movements Corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023)
Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The Copenhagen Corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182
Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior Research Methods, 1–7 (2023)
Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: Empirical foundations for a minimal reporting guideline. Behavior Research Methods 55(1), 364–416 (2023)
Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019)
Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer (2023)
Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence, pp. 4907–4913 (2021)
Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018). https://doi.org/10.3758/s13428-017-0908-4
González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using gaze to predict text readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050
Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016
Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence classification with human attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030
Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2
Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012)
Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: How speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010)
Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low cost eye tracking: The current panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016). https://doi.org/10.1155/2016/8680541
https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. 
[2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. 
[2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. 
In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408. https://aclanthology.org/2020.acl-main.408
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Rayner, K.: Eye movements in reading and information processing: 20 years of research. Psychological bulletin 124(3), 372 (1998) Radach and Kennedy [2004] Radach, R., Kennedy, A.: Theoretical perspectives on eye movements in reading: Past controversies, current issues, and an agenda for future research. European journal of cognitive psychology 16(1-2), 3–26 (2004) Winke [2013] Winke, P.M.: Eye-tracking technology for reading. The companion to language assessment 2, 1029–1046 (2013) Hollenstein et al. [2018] Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: Zuco, a simultaneous eeg and eye-tracking resource for natural sentence reading. Scientific data 5(1), 1–13 (2018) Cop et al. [2017] Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior research methods 49, 602–615 (2017) Berzak et al. [2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: Celer: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. 
[2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 
[2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. 
[2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. 
Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera: Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022)
Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023)
Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421
Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264
Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval, vol. 39. Cambridge University Press, Cambridge (2008)
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: Celer: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. 
[2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 
4907–4913 (2021)
Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018). https://doi.org/10.3758/s13428-017-0908-4
González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using gaze to predict text readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 https://aclanthology.org/W17-5050
Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 https://aclanthology.org/K16-1016
[2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. 
In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 
[2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. 
Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. 
In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. 
In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . 
https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. 
[2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. 
In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. 
IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. 
In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. 
[2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. 
Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022)
Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023)
Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer (2023)
Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence, pp. 4907–4913 (2021)
Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018). https://doi.org/10.3758/s13428-017-0908-4
González-Garduño, A.V., Søgaard, A.: Using gaze to predict text readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050
Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016
Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence classification with human attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030
Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2
Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012)
Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: How speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010)
Ferhat, O., Vilariño, F.: Low cost eye tracking: The current panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016). https://doi.org/10.1155/2016/8680541
Papoutsaki, A.: Scalable webcam eye tracking by learning from user interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems (CHI EA '15), pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627
Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013). https://doi.org/10.1109/TPAMI.2012.101
Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014). https://doi.org/10.1109/TPAMI.2014.2313123
Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: TurkerGaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015)
Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam eye tracking for remote studies of web search. In: Proceedings of the 2017 Conference on Human Information Interaction and Retrieval (CHIIR '17), pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170
Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-based personalized summarization of Wikipedia reading session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext (HUMAN '20). Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662
Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature Communications 11(1), 4553 (2020)
Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022). https://doi.org/10.1016/j.bspc.2022.103521
Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera: Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022)
Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023)
Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421
Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264
Manning, C.D., Raghavan, P., Schütze, H.: Introduction to Information Retrieval. Cambridge University Press, Cambridge (2008)
Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016). https://doi.org/10.3758/s13428-015-0642-8
Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000)
Just, M.A., Carpenter, P.A.: A theory of reading: From eye fixations to comprehension. Psychological Review 87(4), 329 (1980)
DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408
Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021)
Chiang, C.-H., Lee, H.-Y.: Re-examining human annotations for interpretable NLP (2022)
Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023)
Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022)
Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155. IEEE (2021)
Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423
[2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. 
Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . 
https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. 
[2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. 
[2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . 
https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. 
[2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. 
[2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. 
[2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. 
Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023)
Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421
Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264
Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval, vol. 39. Cambridge University Press, Cambridge (2008)
Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016). https://doi.org/10.3758/s13428-015-0642-8
Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000)
Just, M.A., Carpenter, P.A.: A theory of reading: From eye fixations to comprehension. Psychological Review 87(4), 329 (1980)
DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408
Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021)
Chiang, C.-H., Lee, H.-Y.: Re-examining human annotations for interpretable NLP (2022)
Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023)
Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022)
Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movement events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155. IEEE (2021)
Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423
Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022)
Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023)
Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014). https://doi.org/10.1109/TPAMI.2014.2313123
Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: TurkerGaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015)
Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam eye tracking for remote studies of web search. In: Proceedings of the 2017 Conference on Human Information Interaction and Retrieval (CHIIR '17), pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170
Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-based personalized summarization of Wikipedia reading session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext (HUMAN '20). Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662
Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking.
Nature Communications 11(1), 4553 (2020)
Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022). https://doi.org/10.1016/j.bspc.2022.103521
Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera: Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022)
[2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . 
https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. 
[2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. 
Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. 
[2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. 
[2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. 
Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. 
[2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. 
Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. 
[2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. 
[2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. 
In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. 
[2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. 
[2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. 
In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 
45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 
45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. 
Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). 
IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . 
https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
6. Søgaard, A.: Explainable Natural Language Processing. Morgan & Claypool Publishers (2021)
7. Xu, S., Jiang, H., Lau, F.C.M.: User-oriented document summarization through vision-based eye-tracking. In: Proceedings of the 14th International Conference on Intelligent User Interfaces (IUI '09), pp. 7–16. Association for Computing Machinery, New York, NY, USA (2009). https://doi.org/10.1145/1502650.1502656
8. Barrett, M., Bingel, J., Keller, F., Søgaard, A.: Weakly supervised part-of-speech tagging using eye-tracking data. In: Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers), pp. 579–584. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/P16-2094, https://aclanthology.org/P16-2094
9. Hollenstein, N., Zhang, C.: Entity recognition at first sight: Improving NER with eye movement information. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 1–10. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1001, https://aclanthology.org/N19-1001
10. Hollenstein, N., Beinborn, L.: Relative importance in sentence processing. In: Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 2: Short Papers), pp. 141–150. Association for Computational Linguistics, Online (2021). https://doi.org/10.18653/v1/2021.acl-short.19, https://aclanthology.org/2021.acl-short.19
11. McGuire, E., Tomuro, N.: Sentiment analysis with cognitive attention supervision. In: Canadian Conference on AI (2021)
12. Dong, S., Goldstein, J., Yang, G.H.: GazBy: Gaze-based BERT model to incorporate human attention in neural information retrieval. In: Proceedings of the 2022 ACM SIGIR International Conference on Theory of Information Retrieval, pp. 182–192 (2022). https://doi.org/10.1145/3539813.3545129
13. Eberle, O., Brandl, S., Pilot, J., Søgaard, A.: Do transformer models show similar attention patterns to task-specific human gaze? In: Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pp. 4295–4309. Association for Computational Linguistics, Dublin, Ireland (2022). https://doi.org/10.18653/v1/2022.acl-long.296, https://aclanthology.org/2022.acl-long.296
14. Hollenstein, N., Troendle, M., Zhang, C., Langer, N.: ZuCo 2.0: A dataset of physiological recordings during natural reading and annotation. In: Proceedings of the Twelfth Language Resources and Evaluation Conference, pp. 138–146. European Language Resources Association, Marseille, France (2020). https://aclanthology.org/2020.lrec-1.18
15. Malmaud, J., Levy, R., Berzak, Y.: Bridging information-seeking human gaze and machine reading comprehension. In: Proceedings of the 24th Conference on Computational Natural Language Learning, pp. 142–152. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.conll-1.11, https://aclanthology.org/2020.conll-1.11
16. de Leeuw, J.R.: jsPsych: A JavaScript library for creating behavioral experiments in a web browser. Behavior Research Methods 47(1), 1–12 (2015). https://doi.org/10.3758/s13428-014-0458-y
17. Papoutsaki, A., Sangkloy, P., Laskey, J., Daskalova, N., Huang, J., Hays, J.: WebGazer: Scalable webcam eye tracking using user interactions. In: Proceedings of the Twenty-Fifth International Joint Conference on Artificial Intelligence (IJCAI '16), pp. 3839–3845. AAAI Press (2016)
18. Rayner, K., Pollatsek, A., Ashby, J., Clifton Jr., C.: Psychology of Reading (2012)
19. Liversedge, S.P., Findlay, J.M.: Saccadic eye movements and cognition. Trends in Cognitive Sciences 4(1), 6–14 (2000)
20. Rayner, K.: Eye movements in reading and information processing: 20 years of research. Psychological Bulletin 124(3), 372 (1998)
21. Radach, R., Kennedy, A.: Theoretical perspectives on eye movements in reading: Past controversies, current issues, and an agenda for future research. European Journal of Cognitive Psychology 16(1–2), 3–26 (2004)
22. Winke, P.M.: Eye-tracking technology for reading. The Companion to Language Assessment 2, 1029–1046 (2013)
23. Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: ZuCo, a simultaneous EEG and eye-tracking resource for natural sentence reading. Scientific Data 5(1), 1–13 (2018)
24. Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior Research Methods 49, 602–615 (2017)
25. Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: CELER: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022)
26. Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior Research Methods, 1–21 (2022)
27. Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: Activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016)
28. Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in English as a second language: Evidence from the Multilingual Eye-movements Corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023)
29. Hollenstein, N., Barrett, M., Björnsdóttir, M.: The Copenhagen Corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182
30. Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior Research Methods, 1–7 (2023)
31. Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: Empirical foundations for a minimal reporting guideline. Behavior Research Methods 55(1), 364–416 (2023)
32. Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019)
33. Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer (2023)
34. Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence, pp. 4907–4913 (2021)
35. Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018). https://doi.org/10.3758/s13428-017-0908-4
36. González-Garduño, A.V., Søgaard, A.: Using gaze to predict text readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050, https://aclanthology.org/W17-5050
37. Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016, https://aclanthology.org/K16-1016
38. Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence classification with human attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030, https://aclanthology.org/K18-1030
39. Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2
40. Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012)
41. Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: How speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010)
42. Ferhat, O., Vilariño, F.: Low cost eye tracking: The current panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016). https://doi.org/10.1155/2016/8680541
43. Papoutsaki, A.: Scalable webcam eye tracking by learning from user interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems (CHI EA '15), pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627
44. Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013). https://doi.org/10.1109/TPAMI.2012.101
45. Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014). https://doi.org/10.1109/TPAMI.2014.2313123
46. Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: TurkerGaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015)
47. Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam eye tracking for remote studies of web search. In: Proceedings of the 2017 Conference on Human Information Interaction and Retrieval (CHIIR '17), pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170
48. Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-based personalized summarization of Wikipedia reading session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext (HUMAN '20). Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662
49. Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature Communications 11(1), 4553 (2020)
50. Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022). https://doi.org/10.1016/j.bspc.2022.103521
51. Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera: Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022)
52. Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023)
53. Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421, https://aclanthology.org/2020.acl-main.421
54. Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264, https://aclanthology.org/D16-1264
55. Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval, vol. 39. Cambridge University Press, Cambridge (2008)
56. Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016). https://doi.org/10.3758/s13428-015-0642-8
57. Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000)
58. Just, M.A., Carpenter, P.A.: A theory of reading: From eye fixations to comprehension. Psychological Review 87(4), 329 (1980)
59. DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408, https://aclanthology.org/2020.acl-main.408
60. Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021)
61. Chiang, C.-H., Lee, H.-Y.: Re-examining human annotations for interpretable NLP (2022)
62. Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023)
63. Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022)
64. Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155. IEEE (2021)
65. Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423, https://aclanthology.org/N19-1423
66. Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022)
67. Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023)
68. Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
[2016] Papoutsaki, A., Sangkloy, P., Laskey, J., Daskalova, N., Huang, J., Hays, J.: Webgazer: Scalable webcam eye tracking using user interactions. In: Proceedings of the Twenty-Fifth International Joint Conference on Artificial Intelligence. IJCAI’16, pp. 3839–3845. AAAI Press, ??? (2016) Rayner et al. [2012] Rayner, K., Pollatsek, A., Ashby, J., Clifton Jr, C.: Psychology of reading (2012) Liversedge and Findlay [2000] Liversedge, S.P., Findlay, J.M.: Saccadic eye movements and cognition. Trends in cognitive sciences 4(1), 6–14 (2000) Rayner [1998] Rayner, K.: Eye movements in reading and information processing: 20 years of research. Psychological bulletin 124(3), 372 (1998) Radach and Kennedy [2004] Radach, R., Kennedy, A.: Theoretical perspectives on eye movements in reading: Past controversies, current issues, and an agenda for future research. European journal of cognitive psychology 16(1-2), 3–26 (2004) Winke [2013] Winke, P.M.: Eye-tracking technology for reading. The companion to language assessment 2, 1029–1046 (2013) Hollenstein et al. [2018] Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: Zuco, a simultaneous eeg and eye-tracking resource for natural sentence reading. Scientific data 5(1), 1–13 (2018) Cop et al. [2017] Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior research methods 49, 602–615 (2017) Berzak et al. [2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: Celer: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). 
Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. 
[2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. 
[2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. 
Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . 
Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.conll-1.11 . https://aclanthology.org/2020.conll-1.11 de Leeuw [2015] Leeuw, J.R.: jsPsych: A JavaScript library for creating behavioral experiments in a Web browser. Behavior Research Methods 47(1), 1–12 (2015) https://doi.org/10.3758/s13428-014-0458-y Papoutsaki et al. [2016] Papoutsaki, A., Sangkloy, P., Laskey, J., Daskalova, N., Huang, J., Hays, J.: Webgazer: Scalable webcam eye tracking using user interactions. In: Proceedings of the Twenty-Fifth International Joint Conference on Artificial Intelligence. IJCAI’16, pp. 3839–3845. AAAI Press, ??? (2016) Rayner et al. [2012] Rayner, K., Pollatsek, A., Ashby, J., Clifton Jr, C.: Psychology of reading (2012) Liversedge and Findlay [2000] Liversedge, S.P., Findlay, J.M.: Saccadic eye movements and cognition. Trends in cognitive sciences 4(1), 6–14 (2000) Rayner [1998] Rayner, K.: Eye movements in reading and information processing: 20 years of research. Psychological bulletin 124(3), 372 (1998) Radach and Kennedy [2004] Radach, R., Kennedy, A.: Theoretical perspectives on eye movements in reading: Past controversies, current issues, and an agenda for future research. European journal of cognitive psychology 16(1-2), 3–26 (2004) Winke [2013] Winke, P.M.: Eye-tracking technology for reading. The companion to language assessment 2, 1029–1046 (2013) Hollenstein et al. [2018] Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: Zuco, a simultaneous eeg and eye-tracking resource for natural sentence reading. Scientific data 5(1), 1–13 (2018) Cop et al. [2017] Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior research methods 49, 602–615 (2017) Berzak et al. 
[2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: Celer: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. 
Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . 
https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. 
[2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. 
[2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. 
In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 
1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Hollenstein, N., Beinborn, L.: Relative importance in sentence processing. In: Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 2: Short Papers), pp. 141–150. Association for Computational Linguistics, Online (2021). https://doi.org/10.18653/v1/2021.acl-short.19 . https://aclanthology.org/2021.acl-short.19 McGuire and Tomuro [2021] McGuire, E., Tomuro, N.: Sentiment analysis with cognitive attention supervision. In: Canadian Conference on AI (2021) Dong et al. [2022] Dong, S., Goldstein, J., Yang, G.H.: GazBy: Gaze-Based BERT Model to Incorporate Human Attention in Neural Information Retrieval. In: Proceedings of the 2022 ACM SIGIR International Conference on Theory of Information Retrieval, pp. 182–192 (2022). https://doi.org/10.1145/3539813.3545129 Eberle et al. [2022] Eberle, O., Brandl, S., Pilot, J., Søgaard, A.: Do transformer models show similar attention patterns to task-specific human gaze? In: Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pp. 4295–4309. Association for Computational Linguistics, Dublin, Ireland (2022). https://doi.org/10.18653/v1/2022.acl-long.296 . https://aclanthology.org/2022.acl-long.296 Hollenstein et al. [2020] Hollenstein, N., Troendle, M., Zhang, C., Langer, N.: ZuCo 2.0: A dataset of physiological recordings during natural reading and annotation. 
In: Proceedings of the Twelfth Language Resources and Evaluation Conference, pp. 138–146. European Language Resources Association, Marseille, France (2020). https://aclanthology.org/2020.lrec-1.18 Malmaud et al. [2020] Malmaud, J., Levy, R., Berzak, Y.: Bridging information-seeking human gaze and machine reading comprehension. In: Proceedings of the 24th Conference on Computational Natural Language Learning, pp. 142–152. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.conll-1.11 . https://aclanthology.org/2020.conll-1.11 de Leeuw [2015] Leeuw, J.R.: jsPsych: A JavaScript library for creating behavioral experiments in a Web browser. Behavior Research Methods 47(1), 1–12 (2015) https://doi.org/10.3758/s13428-014-0458-y Papoutsaki et al. [2016] Papoutsaki, A., Sangkloy, P., Laskey, J., Daskalova, N., Huang, J., Hays, J.: Webgazer: Scalable webcam eye tracking using user interactions. In: Proceedings of the Twenty-Fifth International Joint Conference on Artificial Intelligence. IJCAI’16, pp. 3839–3845. AAAI Press, ??? (2016) Rayner et al. [2012] Rayner, K., Pollatsek, A., Ashby, J., Clifton Jr, C.: Psychology of reading (2012) Liversedge and Findlay [2000] Liversedge, S.P., Findlay, J.M.: Saccadic eye movements and cognition. Trends in cognitive sciences 4(1), 6–14 (2000) Rayner [1998] Rayner, K.: Eye movements in reading and information processing: 20 years of research. Psychological bulletin 124(3), 372 (1998) Radach and Kennedy [2004] Radach, R., Kennedy, A.: Theoretical perspectives on eye movements in reading: Past controversies, current issues, and an agenda for future research. European journal of cognitive psychology 16(1-2), 3–26 (2004) Winke [2013] Winke, P.M.: Eye-tracking technology for reading. The companion to language assessment 2, 1029–1046 (2013) Hollenstein et al. 
[2018] Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: Zuco, a simultaneous eeg and eye-tracking resource for natural sentence reading. Scientific data 5(1), 1–13 (2018) Cop et al. [2017] Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior research methods 49, 602–615 (2017) Berzak et al. [2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: Celer: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. 
[2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. 
In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . 
https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. 
Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. 
Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) McGuire and Tomuro [2021] McGuire, E., Tomuro, N.: Sentiment analysis with cognitive attention supervision. In: Canadian Conference on AI (2021) Dong et al. [2022] Dong, S., Goldstein, J., Yang, G.H.: GazBy: Gaze-Based BERT Model to Incorporate Human Attention in Neural Information Retrieval. In: Proceedings of the 2022 ACM SIGIR International Conference on Theory of Information Retrieval, pp. 182–192 (2022). https://doi.org/10.1145/3539813.3545129 Eberle et al. [2022] Eberle, O., Brandl, S., Pilot, J., Søgaard, A.: Do transformer models show similar attention patterns to task-specific human gaze? In: Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pp. 4295–4309. Association for Computational Linguistics, Dublin, Ireland (2022).
https://doi.org/10.18653/v1/2022.acl-long.296 . https://aclanthology.org/2022.acl-long.296 Hollenstein et al. [2020] Hollenstein, N., Troendle, M., Zhang, C., Langer, N.: ZuCo 2.0: A dataset of physiological recordings during natural reading and annotation. In: Proceedings of the Twelfth Language Resources and Evaluation Conference, pp. 138–146. European Language Resources Association, Marseille, France (2020). https://aclanthology.org/2020.lrec-1.18 Malmaud et al. [2020] Malmaud, J., Levy, R., Berzak, Y.: Bridging information-seeking human gaze and machine reading comprehension. In: Proceedings of the 24th Conference on Computational Natural Language Learning, pp. 142–152. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.conll-1.11 . https://aclanthology.org/2020.conll-1.11 de Leeuw [2015] de Leeuw, J.R.: jsPsych: A JavaScript library for creating behavioral experiments in a Web browser. Behavior Research Methods 47(1), 1–12 (2015) https://doi.org/10.3758/s13428-014-0458-y Papoutsaki et al. [2016] Papoutsaki, A., Sangkloy, P., Laskey, J., Daskalova, N., Huang, J., Hays, J.: WebGazer: Scalable webcam eye tracking using user interactions. In: Proceedings of the Twenty-Fifth International Joint Conference on Artificial Intelligence. IJCAI’16, pp. 3839–3845. AAAI Press (2016) Rayner et al. [2012] Rayner, K., Pollatsek, A., Ashby, J., Clifton Jr, C.: Psychology of reading (2012) Liversedge and Findlay [2000] Liversedge, S.P., Findlay, J.M.: Saccadic eye movements and cognition. Trends in Cognitive Sciences 4(1), 6–14 (2000) Rayner [1998] Rayner, K.: Eye movements in reading and information processing: 20 years of research. Psychological Bulletin 124(3), 372 (1998) Radach and Kennedy [2004] Radach, R., Kennedy, A.: Theoretical perspectives on eye movements in reading: Past controversies, current issues, and an agenda for future research.
European Journal of Cognitive Psychology 16(1-2), 3–26 (2004) Winke [2013] Winke, P.M.: Eye-tracking technology for reading. The Companion to Language Assessment 2, 1029–1046 (2013) Hollenstein et al. [2018] Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: ZuCo, a simultaneous EEG and eye-tracking resource for natural sentence reading. Scientific Data 5(1), 1–13 (2018) Cop et al. [2017] Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior Research Methods 49, 602–615 (2017) Berzak et al. [2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: CELER: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior Research Methods, 1–21 (2022) Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: Activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in English as a second language: Evidence from the Multilingual Eye-Movements Corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The Copenhagen Corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022).
https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior Research Methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: Empirical foundations for a minimal reporting guideline. Behavior Research Methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al.
[2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: How speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222.
Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: TurkerGaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Human Information Interaction and Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature Communications 11(1), 4553 (2020)
[2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. 
[2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. 
[2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. 
[2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 
71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. 
Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Hollenstein, N., Troendle, M., Zhang, C., Langer, N.: ZuCo 2.0: A dataset of physiological recordings during natural reading and annotation. In: Proceedings of the Twelfth Language Resources and Evaluation Conference, pp. 138–146. European Language Resources Association, Marseille, France (2020). https://aclanthology.org/2020.lrec-1.18 Malmaud et al. [2020] Malmaud, J., Levy, R., Berzak, Y.: Bridging information-seeking human gaze and machine reading comprehension. In: Proceedings of the 24th Conference on Computational Natural Language Learning, pp. 142–152. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.conll-1.11 . https://aclanthology.org/2020.conll-1.11 de Leeuw [2015] Leeuw, J.R.: jsPsych: A JavaScript library for creating behavioral experiments in a Web browser. Behavior Research Methods 47(1), 1–12 (2015) https://doi.org/10.3758/s13428-014-0458-y Papoutsaki et al. [2016] Papoutsaki, A., Sangkloy, P., Laskey, J., Daskalova, N., Huang, J., Hays, J.: Webgazer: Scalable webcam eye tracking using user interactions. 
In: Proceedings of the Twenty-Fifth International Joint Conference on Artificial Intelligence. IJCAI’16, pp. 3839–3845. AAAI Press, ??? (2016) Rayner et al. [2012] Rayner, K., Pollatsek, A., Ashby, J., Clifton Jr, C.: Psychology of reading (2012) Liversedge and Findlay [2000] Liversedge, S.P., Findlay, J.M.: Saccadic eye movements and cognition. Trends in cognitive sciences 4(1), 6–14 (2000) Rayner [1998] Rayner, K.: Eye movements in reading and information processing: 20 years of research. Psychological bulletin 124(3), 372 (1998) Radach and Kennedy [2004] Radach, R., Kennedy, A.: Theoretical perspectives on eye movements in reading: Past controversies, current issues, and an agenda for future research. European journal of cognitive psychology 16(1-2), 3–26 (2004) Winke [2013] Winke, P.M.: Eye-tracking technology for reading. The companion to language assessment 2, 1029–1046 (2013) Hollenstein et al. [2018] Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: Zuco, a simultaneous eeg and eye-tracking resource for natural sentence reading. Scientific data 5(1), 1–13 (2018) Cop et al. [2017] Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior research methods 49, 602–615 (2017) Berzak et al. [2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: Celer: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). Publisher: Springer Desai et al. 
[2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. 
In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. 
In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . 
https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. 
[2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Malmaud, J., Levy, R., Berzak, Y.: Bridging information-seeking human gaze and machine reading comprehension. In: Proceedings of the 24th Conference on Computational Natural Language Learning, pp. 142–152. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.conll-1.11 . https://aclanthology.org/2020.conll-1.11 de Leeuw [2015] Leeuw, J.R.: jsPsych: A JavaScript library for creating behavioral experiments in a Web browser. Behavior Research Methods 47(1), 1–12 (2015) https://doi.org/10.3758/s13428-014-0458-y Papoutsaki et al. [2016] Papoutsaki, A., Sangkloy, P., Laskey, J., Daskalova, N., Huang, J., Hays, J.: Webgazer: Scalable webcam eye tracking using user interactions. In: Proceedings of the Twenty-Fifth International Joint Conference on Artificial Intelligence. IJCAI’16, pp. 3839–3845. AAAI Press, ??? (2016) Rayner et al. [2012] Rayner, K., Pollatsek, A., Ashby, J., Clifton Jr, C.: Psychology of reading (2012) Liversedge and Findlay [2000] Liversedge, S.P., Findlay, J.M.: Saccadic eye movements and cognition. Trends in cognitive sciences 4(1), 6–14 (2000) Rayner [1998] Rayner, K.: Eye movements in reading and information processing: 20 years of research. Psychological bulletin 124(3), 372 (1998) Radach and Kennedy [2004] Radach, R., Kennedy, A.: Theoretical perspectives on eye movements in reading: Past controversies, current issues, and an agenda for future research. European journal of cognitive psychology 16(1-2), 3–26 (2004) Winke [2013] Winke, P.M.: Eye-tracking technology for reading. The companion to language assessment 2, 1029–1046 (2013) Hollenstein et al. 
[2018] Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: Zuco, a simultaneous eeg and eye-tracking resource for natural sentence reading. Scientific data 5(1), 1–13 (2018) Cop et al. [2017] Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior research methods 49, 602–615 (2017) Berzak et al. [2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: Celer: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. 
[2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis.
In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . 
https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: TurkerGaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Human Information Interaction and Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests.
Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera—Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online.
Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023)
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Rayner, K., Pollatsek, A., Ashby, J., Clifton Jr, C.: Psychology of reading (2012) Liversedge and Findlay [2000] Liversedge, S.P., Findlay, J.M.: Saccadic eye movements and cognition. Trends in cognitive sciences 4(1), 6–14 (2000) Rayner [1998] Rayner, K.: Eye movements in reading and information processing: 20 years of research. Psychological bulletin 124(3), 372 (1998) Radach and Kennedy [2004] Radach, R., Kennedy, A.: Theoretical perspectives on eye movements in reading: Past controversies, current issues, and an agenda for future research. European journal of cognitive psychology 16(1-2), 3–26 (2004) Winke [2013] Winke, P.M.: Eye-tracking technology for reading. The companion to language assessment 2, 1029–1046 (2013) Hollenstein et al. [2018] Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: Zuco, a simultaneous eeg and eye-tracking resource for natural sentence reading. Scientific data 5(1), 1–13 (2018) Cop et al. [2017] Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior research methods 49, 602–615 (2017) Berzak et al. [2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: Celer: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). Publisher: Springer Desai et al. 
[2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. 
In: Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence, pp. 4907–4913 (2021)
4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. 
[2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. 
[2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. 
71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. 
Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: Zuco, a simultaneous eeg and eye-tracking resource for natural sentence reading. Scientific data 5(1), 1–13 (2018) Cop et al. [2017] Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior research methods 49, 602–615 (2017) Berzak et al. [2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: Celer: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. 
Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 
4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. 
[2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. 
[2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. 
[2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior research methods 49, 602–615 (2017) Berzak et al. [2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: Celer: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. 
[2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. 
In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . 
https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. 
Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. 
Behavior Research Methods 48(3), 829–842 (2016). https://doi.org/10.3758/s13428-015-0642-8
Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000)
Just, M.A., Carpenter, P.A.: A theory of reading: From eye fixations to comprehension. Psychological Review 87(4), 329 (1980)
DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408
Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021)
Chiang, C.-H., Lee, H.-Y.: Re-examining human annotations for interpretable NLP (2022)
Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023)
Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022)
Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155. IEEE (2021)
Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423
Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022)
Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023)
Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: CELER: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022)
Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior Research Methods, 1–21 (2022)
Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: Activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016)
Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in English as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023)
Hollenstein, N., Barrett, M., Björnsdóttir, M.: The Copenhagen Corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182
Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior Research Methods, 1–7 (2023)
Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: Empirical foundations for a minimal reporting guideline. Behavior Research Methods 55(1), 364–416 (2023)
Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019)
Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer (2023)
Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence, pp. 4907–4913 (2021)
Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018). https://doi.org/10.3758/s13428-017-0908-4
González-Garduño, A.V., Søgaard, A.: Using gaze to predict text readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050
Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016
Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence classification with human attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030
Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2
Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012)
Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: How speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010)
Ferhat, O., Vilariño, F.: Low cost eye tracking: The current panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016). https://doi.org/10.1155/2016/8680541
Papoutsaki, A.: Scalable webcam eye tracking by learning from user interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems (CHI EA '15), pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627
Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013). https://doi.org/10.1109/TPAMI.2012.101
Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014). https://doi.org/10.1109/TPAMI.2014.2313123
Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: TurkerGaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015)
Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam eye tracking for remote studies of web search. In: Proceedings of the 2017 Conference on Human Information Interaction and Retrieval (CHIIR '17), pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170
Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-based personalized summarization of Wikipedia reading session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext (HUMAN '20). Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662
Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature Communications 11(1), 4553 (2020)
Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022). https://doi.org/10.1016/j.bspc.2022.103521
Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera: Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022)
Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023)
Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421
Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264
Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval, vol. 39. Cambridge University Press, Cambridge (2008)
Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016). https://doi.org/10.3758/s13428-015-0642-8
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. 
Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . 
https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. 
[2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. 
[2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. 
In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 
1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. 
[2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. 
[2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. 
[2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. 
[2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. 
In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408
Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021)
Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-examining human annotations for interpretable NLP (2022)
Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023)
Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022)
Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155. IEEE (2021)
Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423
Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp.
1–7 (2022)
Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023)
Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in English as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023)
Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The Copenhagen Corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182
Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior Research Methods, 1–7 (2023)
Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: Empirical foundations for a minimal reporting guideline. Behavior Research Methods 55(1), 364–416 (2023)
Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals.
arXiv preprint arXiv:1904.02682 (2019)
Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer (2023)
Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence, pp. 4907–4913 (2021)
Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018). https://doi.org/10.3758/s13428-017-0908-4
González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using gaze to predict text readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050
Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016
Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence classification with human attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030
Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance.
In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2
Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012)
Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: How speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010)
Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low cost eye tracking: The current panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016). https://doi.org/10.1155/2016/8680541
Papoutsaki [2015] Papoutsaki, A.: Scalable webcam eye tracking by learning from user interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA '15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627
Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013). https://doi.org/10.1109/TPAMI.2012.101
Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014). https://doi.org/10.1109/TPAMI.2014.2313123
In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. 
In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . 
In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. 
Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. 
[2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. 
In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. 
[2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. 
Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. 
[2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. 
Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. 
[2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. 
[2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. 
[2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 
71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. 
Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. 
[2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. 
Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170
Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-based personalized summarization of Wikipedia reading session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN ’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662
Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature Communications 11(1), 4553 (2020)
Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022). https://doi.org/10.1016/j.bspc.2022.103521
Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera: Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022)
Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023)
Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421;
https://aclanthology.org/2020.acl-main.421
Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264; https://aclanthology.org/D16-1264
Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval, vol. 39. Cambridge University Press, Cambridge (2008)
Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016). https://doi.org/10.3758/s13428-015-0642-8
Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000)
Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: From eye fixations to comprehension. Psychological Review 87(4), 329 (1980)
DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408; https://aclanthology.org/2020.acl-main.408
Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing.
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021)
Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-examining human annotations for interpretable NLP (2022)
Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023)
Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022)
Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155. IEEE (2021)
Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423; https://aclanthology.org/N19-1423
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021)
Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. 
[2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. 
[2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. 
Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. 
[2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. 
Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. 
[2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. 
[2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. 
In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. 
[2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. 
[2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. 
In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 
45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 
45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. 
Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). 
IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . 
https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
  7. Xu, S., Jiang, H., Lau, F.C.M.: User-Oriented Document Summarization through Vision-Based Eye-Tracking. In: Proceedings of the 14th International Conference on Intelligent User Interfaces. IUI ’09, pp. 7–16. Association for Computing Machinery, New York, NY, USA (2009). https://doi.org/10.1145/1502650.1502656 . https://doi.org/10.1145/1502650.1502656 Barrett et al. [2016] Barrett, M., Bingel, J., Keller, F., Søgaard, A.: Weakly Supervised Part-of-speech Tagging Using Eye-tracking Data. In: Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers), pp. 579–584. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/P16-2094 . https://aclanthology.org/P16-2094 Hollenstein and Zhang [2019] Hollenstein, N., Zhang, C.: Entity recognition at first sight: Improving NER with eye movement information. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 1–10. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1001 . https://aclanthology.org/N19-1001 Hollenstein and Beinborn [2021] Hollenstein, N., Beinborn, L.: Relative importance in sentence processing. In: Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 2: Short Papers), pp. 141–150. Association for Computational Linguistics, Online (2021). https://doi.org/10.18653/v1/2021.acl-short.19 . https://aclanthology.org/2021.acl-short.19 McGuire and Tomuro [2021] McGuire, E., Tomuro, N.: Sentiment analysis with cognitive attention supervision. In: Canadian Conference on AI (2021) Dong et al. [2022] Dong, S., Goldstein, J., Yang, G.H.: GazBy: Gaze-Based BERT Model to Incorporate Human Attention in Neural Information Retrieval. 
In: Proceedings of the 2022 ACM SIGIR International Conference on Theory of Information Retrieval, pp. 182–192 (2022). https://doi.org/10.1145/3539813.3545129 Eberle et al. [2022] Eberle, O., Brandl, S., Pilot, J., Søgaard, A.: Do transformer models show similar attention patterns to task-specific human gaze? In: Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pp. 4295–4309. Association for Computational Linguistics, Dublin, Ireland (2022). https://doi.org/10.18653/v1/2022.acl-long.296 . https://aclanthology.org/2022.acl-long.296 Hollenstein et al. [2020] Hollenstein, N., Troendle, M., Zhang, C., Langer, N.: ZuCo 2.0: A dataset of physiological recordings during natural reading and annotation. In: Proceedings of the Twelfth Language Resources and Evaluation Conference, pp. 138–146. European Language Resources Association, Marseille, France (2020). https://aclanthology.org/2020.lrec-1.18 Malmaud et al. [2020] Malmaud, J., Levy, R., Berzak, Y.: Bridging information-seeking human gaze and machine reading comprehension. In: Proceedings of the 24th Conference on Computational Natural Language Learning, pp. 142–152. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.conll-1.11 . https://aclanthology.org/2020.conll-1.11 de Leeuw [2015] Leeuw, J.R.: jsPsych: A JavaScript library for creating behavioral experiments in a Web browser. Behavior Research Methods 47(1), 1–12 (2015) https://doi.org/10.3758/s13428-014-0458-y Papoutsaki et al. [2016] Papoutsaki, A., Sangkloy, P., Laskey, J., Daskalova, N., Huang, J., Hays, J.: Webgazer: Scalable webcam eye tracking using user interactions. In: Proceedings of the Twenty-Fifth International Joint Conference on Artificial Intelligence. IJCAI’16, pp. 3839–3845. AAAI Press, ??? (2016) Rayner et al. 
[2012] Rayner, K., Pollatsek, A., Ashby, J., Clifton Jr, C.: Psychology of reading (2012) Liversedge and Findlay [2000] Liversedge, S.P., Findlay, J.M.: Saccadic eye movements and cognition. Trends in cognitive sciences 4(1), 6–14 (2000) Rayner [1998] Rayner, K.: Eye movements in reading and information processing: 20 years of research. Psychological bulletin 124(3), 372 (1998) Radach and Kennedy [2004] Radach, R., Kennedy, A.: Theoretical perspectives on eye movements in reading: Past controversies, current issues, and an agenda for future research. European journal of cognitive psychology 16(1-2), 3–26 (2004) Winke [2013] Winke, P.M.: Eye-tracking technology for reading. The companion to language assessment 2, 1029–1046 (2013) Hollenstein et al. [2018] Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: Zuco, a simultaneous eeg and eye-tracking resource for natural sentence reading. Scientific data 5(1), 1–13 (2018) Cop et al. [2017] Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior research methods 49, 602–615 (2017) Berzak et al. [2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: Celer: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. 
[2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 
4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. 
[2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. 
[2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. 
[2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Barrett, M., Bingel, J., Keller, F., Søgaard, A.: Weakly Supervised Part-of-speech Tagging Using Eye-tracking Data. In: Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers), pp. 579–584. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/P16-2094 . https://aclanthology.org/P16-2094 Hollenstein and Zhang [2019] Hollenstein, N., Zhang, C.: Entity recognition at first sight: Improving NER with eye movement information. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 1–10. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1001 . https://aclanthology.org/N19-1001 Hollenstein and Beinborn [2021] Hollenstein, N., Beinborn, L.: Relative importance in sentence processing. In: Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 2: Short Papers), pp. 141–150. Association for Computational Linguistics, Online (2021). https://doi.org/10.18653/v1/2021.acl-short.19 . https://aclanthology.org/2021.acl-short.19 McGuire and Tomuro [2021] McGuire, E., Tomuro, N.: Sentiment analysis with cognitive attention supervision. In: Canadian Conference on AI (2021) Dong et al. [2022] Dong, S., Goldstein, J., Yang, G.H.: GazBy: Gaze-Based BERT Model to Incorporate Human Attention in Neural Information Retrieval. In: Proceedings of the 2022 ACM SIGIR International Conference on Theory of Information Retrieval, pp. 182–192 (2022). https://doi.org/10.1145/3539813.3545129 Eberle et al. 
[2022] Eberle, O., Brandl, S., Pilot, J., Søgaard, A.: Do transformer models show similar attention patterns to task-specific human gaze? In: Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pp. 4295–4309. Association for Computational Linguistics, Dublin, Ireland (2022). https://doi.org/10.18653/v1/2022.acl-long.296 . https://aclanthology.org/2022.acl-long.296 Hollenstein et al. [2020] Hollenstein, N., Troendle, M., Zhang, C., Langer, N.: ZuCo 2.0: A dataset of physiological recordings during natural reading and annotation. In: Proceedings of the Twelfth Language Resources and Evaluation Conference, pp. 138–146. European Language Resources Association, Marseille, France (2020). https://aclanthology.org/2020.lrec-1.18 Malmaud et al. [2020] Malmaud, J., Levy, R., Berzak, Y.: Bridging information-seeking human gaze and machine reading comprehension. In: Proceedings of the 24th Conference on Computational Natural Language Learning, pp. 142–152. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.conll-1.11 . https://aclanthology.org/2020.conll-1.11 de Leeuw [2015] Leeuw, J.R.: jsPsych: A JavaScript library for creating behavioral experiments in a Web browser. Behavior Research Methods 47(1), 1–12 (2015) https://doi.org/10.3758/s13428-014-0458-y Papoutsaki et al. [2016] Papoutsaki, A., Sangkloy, P., Laskey, J., Daskalova, N., Huang, J., Hays, J.: Webgazer: Scalable webcam eye tracking using user interactions. In: Proceedings of the Twenty-Fifth International Joint Conference on Artificial Intelligence. IJCAI’16, pp. 3839–3845. AAAI Press, ??? (2016) Rayner et al. [2012] Rayner, K., Pollatsek, A., Ashby, J., Clifton Jr, C.: Psychology of reading (2012) Liversedge and Findlay [2000] Liversedge, S.P., Findlay, J.M.: Saccadic eye movements and cognition. 
Trends in cognitive sciences 4(1), 6–14 (2000) Rayner [1998] Rayner, K.: Eye movements in reading and information processing: 20 years of research. Psychological bulletin 124(3), 372 (1998) Radach and Kennedy [2004] Radach, R., Kennedy, A.: Theoretical perspectives on eye movements in reading: Past controversies, current issues, and an agenda for future research. European journal of cognitive psychology 16(1-2), 3–26 (2004) Winke [2013] Winke, P.M.: Eye-tracking technology for reading. The companion to language assessment 2, 1029–1046 (2013) Hollenstein et al. [2018] Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: Zuco, a simultaneous eeg and eye-tracking resource for natural sentence reading. Scientific data 5(1), 1–13 (2018) Cop et al. [2017] Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior research methods 49, 602–615 (2017) Berzak et al. [2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: Celer: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. 
Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The Copenhagen Corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability.
In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . 
Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Human Information Interaction and Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 Valliappan et al.
[2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. 
Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. 
[2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Hollenstein and Zhang [2019] Hollenstein, N., Zhang, C.: Entity recognition at first sight: Improving NER with eye movement information. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 1–10. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1001 .
https://aclanthology.org/N19-1001 Hollenstein and Beinborn [2021] Hollenstein, N., Beinborn, L.: Relative importance in sentence processing. In: Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 2: Short Papers), pp. 141–150. Association for Computational Linguistics, Online (2021). https://doi.org/10.18653/v1/2021.acl-short.19 . https://aclanthology.org/2021.acl-short.19 McGuire and Tomuro [2021] McGuire, E., Tomuro, N.: Sentiment analysis with cognitive attention supervision. In: Canadian Conference on AI (2021) Dong et al. [2022] Dong, S., Goldstein, J., Yang, G.H.: GazBy: Gaze-Based BERT Model to Incorporate Human Attention in Neural Information Retrieval. In: Proceedings of the 2022 ACM SIGIR International Conference on Theory of Information Retrieval, pp. 182–192 (2022). https://doi.org/10.1145/3539813.3545129
https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. 
Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. 
Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) McGuire and Tomuro [2021] McGuire, E., Tomuro, N.: Sentiment analysis with cognitive attention supervision. In: Canadian Conference on AI (2021) Dong et al. [2022] Dong, S., Goldstein, J., Yang, G.H.: GazBy: Gaze-Based BERT Model to Incorporate Human Attention in Neural Information Retrieval. In: Proceedings of the 2022 ACM SIGIR International Conference on Theory of Information Retrieval, pp. 182–192 (2022). https://doi.org/10.1145/3539813.3545129 Eberle et al. [2022] Eberle, O., Brandl, S., Pilot, J., Søgaard, A.: Do transformer models show similar attention patterns to task-specific human gaze? In: Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pp. 4295–4309. Association for Computational Linguistics, Dublin, Ireland (2022). 
https://doi.org/10.18653/v1/2022.acl-long.296 . https://aclanthology.org/2022.acl-long.296 Hollenstein et al. [2020] Hollenstein, N., Troendle, M., Zhang, C., Langer, N.: ZuCo 2.0: A dataset of physiological recordings during natural reading and annotation. In: Proceedings of the Twelfth Language Resources and Evaluation Conference, pp. 138–146. European Language Resources Association, Marseille, France (2020). https://aclanthology.org/2020.lrec-1.18 Malmaud et al. [2020] Malmaud, J., Levy, R., Berzak, Y.: Bridging information-seeking human gaze and machine reading comprehension. In: Proceedings of the 24th Conference on Computational Natural Language Learning, pp. 142–152. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.conll-1.11 . https://aclanthology.org/2020.conll-1.11 de Leeuw [2015] Leeuw, J.R.: jsPsych: A JavaScript library for creating behavioral experiments in a Web browser. Behavior Research Methods 47(1), 1–12 (2015) https://doi.org/10.3758/s13428-014-0458-y Papoutsaki et al. [2016] Papoutsaki, A., Sangkloy, P., Laskey, J., Daskalova, N., Huang, J., Hays, J.: Webgazer: Scalable webcam eye tracking using user interactions. In: Proceedings of the Twenty-Fifth International Joint Conference on Artificial Intelligence. IJCAI’16, pp. 3839–3845. AAAI Press, ??? (2016) Rayner et al. [2012] Rayner, K., Pollatsek, A., Ashby, J., Clifton Jr, C.: Psychology of reading (2012) Liversedge and Findlay [2000] Liversedge, S.P., Findlay, J.M.: Saccadic eye movements and cognition. Trends in cognitive sciences 4(1), 6–14 (2000) Rayner [1998] Rayner, K.: Eye movements in reading and information processing: 20 years of research. Psychological bulletin 124(3), 372 (1998) Radach and Kennedy [2004] Radach, R., Kennedy, A.: Theoretical perspectives on eye movements in reading: Past controversies, current issues, and an agenda for future research. 
European journal of cognitive psychology 16(1-2), 3–26 (2004) Winke [2013] Winke, P.M.: Eye-tracking technology for reading. The companion to language assessment 2, 1029–1046 (2013) Hollenstein et al. [2018] Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: Zuco, a simultaneous eeg and eye-tracking resource for natural sentence reading. Scientific data 5(1), 1–13 (2018) Cop et al. [2017] Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior research methods 49, 602–615 (2017) Berzak et al. [2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: Celer: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). 
https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. 
[2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. 
Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. 
[2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. 
Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Dong, S., Goldstein, J., Yang, G.H.: GazBy: Gaze-Based BERT Model to Incorporate Human Attention in Neural Information Retrieval. In: Proceedings of the 2022 ACM SIGIR International Conference on Theory of Information Retrieval, pp. 182–192 (2022). https://doi.org/10.1145/3539813.3545129 Eberle et al. [2022] Eberle, O., Brandl, S., Pilot, J., Søgaard, A.: Do transformer models show similar attention patterns to task-specific human gaze? In: Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pp. 4295–4309. Association for Computational Linguistics, Dublin, Ireland (2022). https://doi.org/10.18653/v1/2022.acl-long.296 . https://aclanthology.org/2022.acl-long.296 Hollenstein et al. 
[2020] Hollenstein, N., Troendle, M., Zhang, C., Langer, N.: ZuCo 2.0: A dataset of physiological recordings during natural reading and annotation. In: Proceedings of the Twelfth Language Resources and Evaluation Conference, pp. 138–146. European Language Resources Association, Marseille, France (2020). https://aclanthology.org/2020.lrec-1.18 Malmaud et al. [2020] Malmaud, J., Levy, R., Berzak, Y.: Bridging information-seeking human gaze and machine reading comprehension. In: Proceedings of the 24th Conference on Computational Natural Language Learning, pp. 142–152. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.conll-1.11 . https://aclanthology.org/2020.conll-1.11 de Leeuw [2015] Leeuw, J.R.: jsPsych: A JavaScript library for creating behavioral experiments in a Web browser. Behavior Research Methods 47(1), 1–12 (2015) https://doi.org/10.3758/s13428-014-0458-y Papoutsaki et al. [2016] Papoutsaki, A., Sangkloy, P., Laskey, J., Daskalova, N., Huang, J., Hays, J.: Webgazer: Scalable webcam eye tracking using user interactions. In: Proceedings of the Twenty-Fifth International Joint Conference on Artificial Intelligence. IJCAI’16, pp. 3839–3845. AAAI Press, ??? (2016) Rayner et al. [2012] Rayner, K., Pollatsek, A., Ashby, J., Clifton Jr, C.: Psychology of reading (2012) Liversedge and Findlay [2000] Liversedge, S.P., Findlay, J.M.: Saccadic eye movements and cognition. Trends in cognitive sciences 4(1), 6–14 (2000) Rayner [1998] Rayner, K.: Eye movements in reading and information processing: 20 years of research. Psychological bulletin 124(3), 372 (1998) Radach and Kennedy [2004] Radach, R., Kennedy, A.: Theoretical perspectives on eye movements in reading: Past controversies, current issues, and an agenda for future research. European journal of cognitive psychology 16(1-2), 3–26 (2004) Winke [2013] Winke, P.M.: Eye-tracking technology for reading. The companion to language assessment 2, 1029–1046 (2013) Hollenstein et al. 
[2018] Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: Zuco, a simultaneous eeg and eye-tracking resource for natural sentence reading. Scientific data 5(1), 1–13 (2018) Cop et al. [2017] Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior research methods 49, 602–615 (2017) Berzak et al. [2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: Celer: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. 
[2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. 
In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . 
Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: TurkerGaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Human Information Interaction and Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests.
Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera: Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press, Cambridge (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online.
Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Eberle et al. [2022] Eberle, O., Brandl, S., Pilot, J., Søgaard, A.: Do transformer models show similar attention patterns to task-specific human gaze? In: Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pp. 4295–4309. Association for Computational Linguistics, Dublin, Ireland (2022). https://doi.org/10.18653/v1/2022.acl-long.296 . https://aclanthology.org/2022.acl-long.296 Hollenstein et al. [2020] Hollenstein, N., Troendle, M., Zhang, C., Langer, N.: ZuCo 2.0: A dataset of physiological recordings during natural reading and annotation. In: Proceedings of the Twelfth Language Resources and Evaluation Conference, pp. 138–146. European Language Resources Association, Marseille, France (2020).
https://aclanthology.org/2020.lrec-1.18 Malmaud et al. [2020] Malmaud, J., Levy, R., Berzak, Y.: Bridging information-seeking human gaze and machine reading comprehension. In: Proceedings of the 24th Conference on Computational Natural Language Learning, pp. 142–152. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.conll-1.11 . https://aclanthology.org/2020.conll-1.11 de Leeuw [2015] de Leeuw, J.R.: jsPsych: A JavaScript library for creating behavioral experiments in a Web browser. Behavior Research Methods 47(1), 1–12 (2015) https://doi.org/10.3758/s13428-014-0458-y Papoutsaki et al. [2016] Papoutsaki, A., Sangkloy, P., Laskey, J., Daskalova, N., Huang, J., Hays, J.: WebGazer: Scalable webcam eye tracking using user interactions. In: Proceedings of the Twenty-Fifth International Joint Conference on Artificial Intelligence. IJCAI’16, pp. 3839–3845. AAAI Press, ??? (2016) Rayner et al. [2012] Rayner, K., Pollatsek, A., Ashby, J., Clifton Jr, C.: Psychology of reading (2012) Liversedge and Findlay [2000] Liversedge, S.P., Findlay, J.M.: Saccadic eye movements and cognition. Trends in cognitive sciences 4(1), 6–14 (2000) Rayner [1998] Rayner, K.: Eye movements in reading and information processing: 20 years of research. Psychological bulletin 124(3), 372 (1998) Radach and Kennedy [2004] Radach, R., Kennedy, A.: Theoretical perspectives on eye movements in reading: Past controversies, current issues, and an agenda for future research. European journal of cognitive psychology 16(1-2), 3–26 (2004) Winke [2013] Winke, P.M.: Eye-tracking technology for reading. The companion to language assessment 2, 1029–1046 (2013)
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Malmaud, J., Levy, R., Berzak, Y.: Bridging information-seeking human gaze and machine reading comprehension. In: Proceedings of the 24th Conference on Computational Natural Language Learning, pp. 142–152. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.conll-1.11 . https://aclanthology.org/2020.conll-1.11 de Leeuw [2015] Leeuw, J.R.: jsPsych: A JavaScript library for creating behavioral experiments in a Web browser. Behavior Research Methods 47(1), 1–12 (2015) https://doi.org/10.3758/s13428-014-0458-y Papoutsaki et al. [2016] Papoutsaki, A., Sangkloy, P., Laskey, J., Daskalova, N., Huang, J., Hays, J.: Webgazer: Scalable webcam eye tracking using user interactions. In: Proceedings of the Twenty-Fifth International Joint Conference on Artificial Intelligence. IJCAI’16, pp. 3839–3845. AAAI Press, ??? (2016) Rayner et al. [2012] Rayner, K., Pollatsek, A., Ashby, J., Clifton Jr, C.: Psychology of reading (2012) Liversedge and Findlay [2000] Liversedge, S.P., Findlay, J.M.: Saccadic eye movements and cognition. Trends in cognitive sciences 4(1), 6–14 (2000) Rayner [1998] Rayner, K.: Eye movements in reading and information processing: 20 years of research. Psychological bulletin 124(3), 372 (1998) Radach and Kennedy [2004] Radach, R., Kennedy, A.: Theoretical perspectives on eye movements in reading: Past controversies, current issues, and an agenda for future research. European journal of cognitive psychology 16(1-2), 3–26 (2004) Winke [2013] Winke, P.M.: Eye-tracking technology for reading. The companion to language assessment 2, 1029–1046 (2013) Hollenstein et al. 
[2018] Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: Zuco, a simultaneous eeg and eye-tracking resource for natural sentence reading. Scientific data 5(1), 1–13 (2018) Cop et al. [2017] Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior research methods 49, 602–615 (2017) Berzak et al. [2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: Celer: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. 
[2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. 
[2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 
4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. 
[2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. 
Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-based personalized summarization of Wikipedia reading session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext (HUMAN'20). Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662
[2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. 
[2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Rayner, K.: Eye movements in reading and information processing: 20 years of research. Psychological bulletin 124(3), 372 (1998) Radach and Kennedy [2004] Radach, R., Kennedy, A.: Theoretical perspectives on eye movements in reading: Past controversies, current issues, and an agenda for future research. European journal of cognitive psychology 16(1-2), 3–26 (2004) Winke [2013] Winke, P.M.: Eye-tracking technology for reading. The companion to language assessment 2, 1029–1046 (2013) Hollenstein et al. [2018] Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: Zuco, a simultaneous eeg and eye-tracking resource for natural sentence reading. Scientific data 5(1), 1–13 (2018) Cop et al. [2017] Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior research methods 49, 602–615 (2017) Berzak et al. [2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: Celer: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. 
[2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 
4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. 
[2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. 
[2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. 
[2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Radach, R., Kennedy, A.: Theoretical perspectives on eye movements in reading: Past controversies, current issues, and an agenda for future research. European journal of cognitive psychology 16(1-2), 3–26 (2004) Winke [2013] Winke, P.M.: Eye-tracking technology for reading. The companion to language assessment 2, 1029–1046 (2013) Hollenstein et al. [2018] Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: Zuco, a simultaneous eeg and eye-tracking resource for natural sentence reading. Scientific data 5(1), 1–13 (2018) Cop et al. [2017] Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior research methods 49, 602–615 (2017) Berzak et al. [2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: Celer: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. 
Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. 
In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . 
Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. 
[2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature Communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval.
Cambridge University Press, Cambridge (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological Review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al.
[2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Winke [2013] Winke, P.M.: Eye-tracking technology for reading. The Companion to Language Assessment 2, 1029–1046 (2013) Hollenstein et al. [2018] Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: ZuCo, a simultaneous EEG and eye-tracking resource for natural sentence reading. Scientific Data 5(1), 1–13 (2018) Cop et al. [2017] Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading.
Behavior Research Methods 49, 602–615 (2017) Berzak et al. [2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: CELER: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior Research Methods, 1–21 (2022). Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in English as a second language: Evidence from the Multilingual Eye-Movements Corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The Copenhagen Corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior Research Methods, 1–7 (2023) Holmqvist et al.
[2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior Research Methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al.
[2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation
Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. 
Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: Celer: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. 
[2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. 
[2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. 
https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. 
CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. 
[2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. 
Biomedical Signal Processing and Control 74, 103521 (2022). https://doi.org/10.1016/j.bspc.2022.103521
71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. 
[2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. 
Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . 
https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. 
[2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. 
[2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . 
https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. 
[2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. 
Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155. IEEE (2021)

Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423

Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022)

Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023)

Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)

Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013). https://doi.org/10.1109/TPAMI.2012.101

Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014). https://doi.org/10.1109/TPAMI.2014.2313123

Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: TurkerGaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015)

Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam eye tracking for remote studies of web search. In: Proceedings of the 2017 Conference on Human Information Interaction and Retrieval (CHIIR '17), pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170

Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-based personalized summarization of Wikipedia reading session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext (HUMAN '20). Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662

Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature Communications 11(1), 4553 (2020)

Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022). https://doi.org/10.1016/j.bspc.2022.103521

Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera: Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022)

Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023)

Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421

Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264

Manning, C.D., Raghavan, P., Schütze, H.: Introduction to Information Retrieval. Cambridge University Press, Cambridge (2008)

Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016). https://doi.org/10.3758/s13428-015-0642-8

Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000)

Just, M.A., Carpenter, P.A.: A theory of reading: From eye fixations to comprehension. Psychological Review 87(4), 329 (1980)

DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408

Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021)

Chiang, C.-H., Lee, H.-Y.: Re-examining human annotations for interpretable NLP (2022)

Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023)

Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022)
[2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. 
In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. 
[2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . 
https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. 
[2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. 
Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. 
[2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. 
[2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. 
Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. 
[2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. 
Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. 
[2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. 
[2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. 
In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. 
[2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. 
[2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. 
In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 
45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 
45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. 
Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). 
IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . 
https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
  8. Barrett, M., Bingel, J., Keller, F., Søgaard, A.: Weakly supervised part-of-speech tagging using eye-tracking data. In: Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers), pp. 579–584. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/P16-2094
  9. Hollenstein, N., Zhang, C.: Entity recognition at first sight: Improving NER with eye movement information. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 1–10. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1001
  10. Hollenstein, N., Beinborn, L.: Relative importance in sentence processing. In: Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 2: Short Papers), pp. 141–150. Association for Computational Linguistics, Online (2021). https://doi.org/10.18653/v1/2021.acl-short.19
  11. McGuire, E., Tomuro, N.: Sentiment analysis with cognitive attention supervision. In: Canadian Conference on AI (2021)
  12. Dong, S., Goldstein, J., Yang, G.H.: GazBy: Gaze-based BERT model to incorporate human attention in neural information retrieval. In: Proceedings of the 2022 ACM SIGIR International Conference on Theory of Information Retrieval, pp. 182–192 (2022). https://doi.org/10.1145/3539813.3545129
  13. Eberle, O., Brandl, S., Pilot, J., Søgaard, A.: Do transformer models show similar attention patterns to task-specific human gaze? In: Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pp. 4295–4309. Association for Computational Linguistics, Dublin, Ireland (2022). https://doi.org/10.18653/v1/2022.acl-long.296
  14. Hollenstein, N., Troendle, M., Zhang, C., Langer, N.: ZuCo 2.0: A dataset of physiological recordings during natural reading and annotation. In: Proceedings of the Twelfth Language Resources and Evaluation Conference, pp. 138–146. European Language Resources Association, Marseille, France (2020). https://aclanthology.org/2020.lrec-1.18
  15. Malmaud, J., Levy, R., Berzak, Y.: Bridging information-seeking human gaze and machine reading comprehension. In: Proceedings of the 24th Conference on Computational Natural Language Learning, pp. 142–152. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.conll-1.11
  16. de Leeuw, J.R.: jsPsych: A JavaScript library for creating behavioral experiments in a web browser. Behavior Research Methods 47(1), 1–12 (2015). https://doi.org/10.3758/s13428-014-0458-y
  17. Papoutsaki, A., Sangkloy, P., Laskey, J., Daskalova, N., Huang, J., Hays, J.: WebGazer: Scalable webcam eye tracking using user interactions. In: Proceedings of the Twenty-Fifth International Joint Conference on Artificial Intelligence, IJCAI'16, pp. 3839–3845. AAAI Press (2016)
  18. Rayner, K., Pollatsek, A., Ashby, J., Clifton Jr, C.: Psychology of Reading (2012)
  19. Liversedge, S.P., Findlay, J.M.: Saccadic eye movements and cognition. Trends in Cognitive Sciences 4(1), 6–14 (2000)
  20. Rayner, K.: Eye movements in reading and information processing: 20 years of research. Psychological Bulletin 124(3), 372 (1998)
  21. Radach, R., Kennedy, A.: Theoretical perspectives on eye movements in reading: Past controversies, current issues, and an agenda for future research. European Journal of Cognitive Psychology 16(1–2), 3–26 (2004)
  22. Winke, P.M.: Eye-tracking technology for reading. The Companion to Language Assessment 2, 1029–1046 (2013)
  23. Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: ZuCo, a simultaneous EEG and eye-tracking resource for natural sentence reading. Scientific Data 5(1), 1–13 (2018)
  24. Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior Research Methods 49, 602–615 (2017)
  25. Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: CELER: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022)
  26. Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior Research Methods, 1–21 (2022). Springer
  27. Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: Activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016)
  28. Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in English as a second language: Evidence from the Multilingual Eye-movements Corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023)
  29. Hollenstein, N., Barrett, M., Björnsdóttir, M.: The Copenhagen Corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182
  30. Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior Research Methods, 1–7 (2023)
  31. Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: Empirical foundations for a minimal reporting guideline. Behavior Research Methods 55(1), 364–416 (2023)
  32. Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019)
  33. Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer (2023)
  34. Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence, pp. 4907–4913 (2021)
  35. Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018). https://doi.org/10.3758/s13428-017-0908-4
  36. González-Garduño, A.V., Søgaard, A.: Using gaze to predict text readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050
  37. Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016
  38. Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence classification with human attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030
  39. Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2
  40. Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012)
  41. Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: How speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010)
  42. Ferhat, O., Vilariño, F.: Low cost eye tracking: The current panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016). https://doi.org/10.1155/2016/8680541. Hindawi Publishing Corporation
  43. Papoutsaki, A.: Scalable webcam eye tracking by learning from user interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems, CHI EA '15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627
  44. Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013). https://doi.org/10.1109/TPAMI.2012.101
  45. Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014). https://doi.org/10.1109/TPAMI.2014.2313123
  46. Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: TurkerGaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015)
  47. Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam eye tracking for remote studies of web search. In: Proceedings of the 2017 Conference on Human Information Interaction and Retrieval, CHIIR '17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170
  48. Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-based personalized summarization of Wikipedia reading session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext, HUMAN '20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662
  49. Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature Communications 11(1), 4553 (2020)
  50. Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022). https://doi.org/10.1016/j.bspc.2022.103521
  51. Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera: Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022)
  52. Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023)
  53. Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421
  54. Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264
  55. Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval, vol. 39. Cambridge University Press, Cambridge (2008)
  56. Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016). https://doi.org/10.3758/s13428-015-0642-8
  57. Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000)
  58. Just, M.A., Carpenter, P.A.: A theory of reading: From eye fixations to comprehension. Psychological Review 87(4), 329 (1980)
  59. DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408
  60. Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021)
  61. Chiang, C.-H., Lee, H.-Y.: Re-examining human annotations for interpretable NLP (2022)
  62. Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023)
  63. Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022)
  64. Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155. IEEE (2021)
  65. Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423
  66. Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022)
  67. Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023)
  68. Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
https://aclanthology.org/N19-1001 Hollenstein and Beinborn [2021] Hollenstein, N., Beinborn, L.: Relative importance in sentence processing. In: Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 2: Short Papers), pp. 141–150. Association for Computational Linguistics, Online (2021). https://doi.org/10.18653/v1/2021.acl-short.19 . https://aclanthology.org/2021.acl-short.19 McGuire and Tomuro [2021] McGuire, E., Tomuro, N.: Sentiment analysis with cognitive attention supervision. In: Canadian Conference on AI (2021) Dong et al. [2022] Dong, S., Goldstein, J., Yang, G.H.: GazBy: Gaze-Based BERT Model to Incorporate Human Attention in Neural Information Retrieval. In: Proceedings of the 2022 ACM SIGIR International Conference on Theory of Information Retrieval, pp. 182–192 (2022). https://doi.org/10.1145/3539813.3545129 Eberle et al. [2022] Eberle, O., Brandl, S., Pilot, J., Søgaard, A.: Do transformer models show similar attention patterns to task-specific human gaze? In: Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pp. 4295–4309. Association for Computational Linguistics, Dublin, Ireland (2022). https://doi.org/10.18653/v1/2022.acl-long.296 . https://aclanthology.org/2022.acl-long.296 Hollenstein et al. [2020] Hollenstein, N., Troendle, M., Zhang, C., Langer, N.: ZuCo 2.0: A dataset of physiological recordings during natural reading and annotation. In: Proceedings of the Twelfth Language Resources and Evaluation Conference, pp. 138–146. European Language Resources Association, Marseille, France (2020). https://aclanthology.org/2020.lrec-1.18 Malmaud et al. [2020] Malmaud, J., Levy, R., Berzak, Y.: Bridging information-seeking human gaze and machine reading comprehension. In: Proceedings of the 24th Conference on Computational Natural Language Learning, pp. 142–152. 
Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.conll-1.11 . https://aclanthology.org/2020.conll-1.11 de Leeuw [2015] Leeuw, J.R.: jsPsych: A JavaScript library for creating behavioral experiments in a Web browser. Behavior Research Methods 47(1), 1–12 (2015) https://doi.org/10.3758/s13428-014-0458-y Papoutsaki et al. [2016] Papoutsaki, A., Sangkloy, P., Laskey, J., Daskalova, N., Huang, J., Hays, J.: Webgazer: Scalable webcam eye tracking using user interactions. In: Proceedings of the Twenty-Fifth International Joint Conference on Artificial Intelligence. IJCAI’16, pp. 3839–3845. AAAI Press, ??? (2016) Rayner et al. [2012] Rayner, K., Pollatsek, A., Ashby, J., Clifton Jr, C.: Psychology of reading (2012) Liversedge and Findlay [2000] Liversedge, S.P., Findlay, J.M.: Saccadic eye movements and cognition. Trends in cognitive sciences 4(1), 6–14 (2000) Rayner [1998] Rayner, K.: Eye movements in reading and information processing: 20 years of research. Psychological bulletin 124(3), 372 (1998) Radach and Kennedy [2004] Radach, R., Kennedy, A.: Theoretical perspectives on eye movements in reading: Past controversies, current issues, and an agenda for future research. European journal of cognitive psychology 16(1-2), 3–26 (2004) Winke [2013] Winke, P.M.: Eye-tracking technology for reading. The companion to language assessment 2, 1029–1046 (2013) Hollenstein et al. [2018] Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: Zuco, a simultaneous eeg and eye-tracking resource for natural sentence reading. Scientific data 5(1), 1–13 (2018) Cop et al. [2017] Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior research methods 49, 602–615 (2017) Berzak et al. 
[2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: Celer: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. 
Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . 
https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. 
[2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. 
[2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. 
In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 
1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Hollenstein, N., Beinborn, L.: Relative importance in sentence processing. In: Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 2: Short Papers), pp. 141–150. Association for Computational Linguistics, Online (2021). https://doi.org/10.18653/v1/2021.acl-short.19 . https://aclanthology.org/2021.acl-short.19 McGuire and Tomuro [2021] McGuire, E., Tomuro, N.: Sentiment analysis with cognitive attention supervision. In: Canadian Conference on AI (2021) Dong et al. [2022] Dong, S., Goldstein, J., Yang, G.H.: GazBy: Gaze-Based BERT Model to Incorporate Human Attention in Neural Information Retrieval. In: Proceedings of the 2022 ACM SIGIR International Conference on Theory of Information Retrieval, pp. 182–192 (2022). https://doi.org/10.1145/3539813.3545129 Eberle et al. [2022] Eberle, O., Brandl, S., Pilot, J., Søgaard, A.: Do transformer models show similar attention patterns to task-specific human gaze? In: Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pp. 4295–4309. Association for Computational Linguistics, Dublin, Ireland (2022). https://doi.org/10.18653/v1/2022.acl-long.296 . https://aclanthology.org/2022.acl-long.296 Hollenstein et al. [2020] Hollenstein, N., Troendle, M., Zhang, C., Langer, N.: ZuCo 2.0: A dataset of physiological recordings during natural reading and annotation. 
In: Proceedings of the Twelfth Language Resources and Evaluation Conference, pp. 138–146. European Language Resources Association, Marseille, France (2020). https://aclanthology.org/2020.lrec-1.18 Malmaud et al. [2020] Malmaud, J., Levy, R., Berzak, Y.: Bridging information-seeking human gaze and machine reading comprehension. In: Proceedings of the 24th Conference on Computational Natural Language Learning, pp. 142–152. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.conll-1.11 . https://aclanthology.org/2020.conll-1.11 de Leeuw [2015] Leeuw, J.R.: jsPsych: A JavaScript library for creating behavioral experiments in a Web browser. Behavior Research Methods 47(1), 1–12 (2015) https://doi.org/10.3758/s13428-014-0458-y Papoutsaki et al. [2016] Papoutsaki, A., Sangkloy, P., Laskey, J., Daskalova, N., Huang, J., Hays, J.: Webgazer: Scalable webcam eye tracking using user interactions. In: Proceedings of the Twenty-Fifth International Joint Conference on Artificial Intelligence. IJCAI’16, pp. 3839–3845. AAAI Press, ??? (2016) Rayner et al. [2012] Rayner, K., Pollatsek, A., Ashby, J., Clifton Jr, C.: Psychology of reading (2012) Liversedge and Findlay [2000] Liversedge, S.P., Findlay, J.M.: Saccadic eye movements and cognition. Trends in cognitive sciences 4(1), 6–14 (2000) Rayner [1998] Rayner, K.: Eye movements in reading and information processing: 20 years of research. Psychological bulletin 124(3), 372 (1998) Radach and Kennedy [2004] Radach, R., Kennedy, A.: Theoretical perspectives on eye movements in reading: Past controversies, current issues, and an agenda for future research. European journal of cognitive psychology 16(1-2), 3–26 (2004) Winke [2013] Winke, P.M.: Eye-tracking technology for reading. The companion to language assessment 2, 1029–1046 (2013) Hollenstein et al. 
[2018] Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: Zuco, a simultaneous eeg and eye-tracking resource for natural sentence reading. Scientific data 5(1), 1–13 (2018) Cop et al. [2017] Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior research methods 49, 602–615 (2017) Berzak et al. [2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: Celer: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. 
[2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. 
In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . 
https://doi.org/10.1145/2702613.2702627
Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013). https://doi.org/10.1109/TPAMI.2012.101
Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014). https://doi.org/10.1109/TPAMI.2014.2313123
Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: TurkerGaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015)
Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR '17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170
Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN'20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662
Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature Communications 11(1), 4553 (2020)
Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022). https://doi.org/10.1016/j.bspc.2022.103521
Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera—Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022)
Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023)
Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 https://aclanthology.org/2020.acl-main.421
Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 https://aclanthology.org/D16-1264
Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press, Cambridge (2008)
Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016). https://doi.org/10.3758/s13428-015-0642-8
Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000)
Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological Review 87(4), 329 (1980)
DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 https://aclanthology.org/2020.acl-main.408
Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021)
Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022)
Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023)
Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022)
Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155. IEEE (2021)
Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 https://aclanthology.org/N19-1423
Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022)
Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023)
Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
McGuire and Tomuro [2021] McGuire, E., Tomuro, N.: Sentiment analysis with cognitive attention supervision. In: Canadian Conference on AI (2021)
Dong et al. [2022] Dong, S., Goldstein, J., Yang, G.H.: GazBy: Gaze-Based BERT Model to Incorporate Human Attention in Neural Information Retrieval. In: Proceedings of the 2022 ACM SIGIR International Conference on Theory of Information Retrieval, pp. 182–192 (2022). https://doi.org/10.1145/3539813.3545129
Eberle et al. [2022] Eberle, O., Brandl, S., Pilot, J., Søgaard, A.: Do transformer models show similar attention patterns to task-specific human gaze? In: Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pp. 4295–4309. Association for Computational Linguistics, Dublin, Ireland (2022). https://doi.org/10.18653/v1/2022.acl-long.296 https://aclanthology.org/2022.acl-long.296
Hollenstein et al. [2020] Hollenstein, N., Troendle, M., Zhang, C., Langer, N.: ZuCo 2.0: A dataset of physiological recordings during natural reading and annotation. In: Proceedings of the Twelfth Language Resources and Evaluation Conference, pp. 138–146. European Language Resources Association, Marseille, France (2020). https://aclanthology.org/2020.lrec-1.18
Malmaud et al. [2020] Malmaud, J., Levy, R., Berzak, Y.: Bridging information-seeking human gaze and machine reading comprehension. In: Proceedings of the 24th Conference on Computational Natural Language Learning, pp. 142–152. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.conll-1.11 https://aclanthology.org/2020.conll-1.11
de Leeuw [2015] de Leeuw, J.R.: jsPsych: A JavaScript library for creating behavioral experiments in a Web browser. Behavior Research Methods 47(1), 1–12 (2015). https://doi.org/10.3758/s13428-014-0458-y
Papoutsaki et al. [2016] Papoutsaki, A., Sangkloy, P., Laskey, J., Daskalova, N., Huang, J., Hays, J.: WebGazer: Scalable webcam eye tracking using user interactions. In: Proceedings of the Twenty-Fifth International Joint Conference on Artificial Intelligence. IJCAI'16, pp. 3839–3845. AAAI Press (2016)
Rayner et al. [2012] Rayner, K., Pollatsek, A., Ashby, J., Clifton Jr, C.: Psychology of Reading (2012)
Liversedge and Findlay [2000] Liversedge, S.P., Findlay, J.M.: Saccadic eye movements and cognition. Trends in Cognitive Sciences 4(1), 6–14 (2000)
Rayner [1998] Rayner, K.: Eye movements in reading and information processing: 20 years of research. Psychological Bulletin 124(3), 372 (1998)
Radach and Kennedy [2004] Radach, R., Kennedy, A.: Theoretical perspectives on eye movements in reading: Past controversies, current issues, and an agenda for future research. European Journal of Cognitive Psychology 16(1-2), 3–26 (2004)
Winke [2013] Winke, P.M.: Eye-tracking technology for reading. The Companion to Language Assessment 2, 1029–1046 (2013)
Hollenstein et al. [2018] Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: ZuCo, a simultaneous EEG and eye-tracking resource for natural sentence reading. Scientific Data 5(1), 1–13 (2018)
Cop et al. [2017] Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior Research Methods 49, 602–615 (2017)
Berzak et al. [2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: CELER: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022)
Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior Research Methods, 1–21 (2022). Springer
Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016)
Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in English as a second language: Evidence from the Multilingual Eye-Movements Corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023)
Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The Copenhagen Corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182
Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior Research Methods, 1–7 (2023)
Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior Research Methods 55(1), 364–416 (2023)
Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019)
Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer (2023)
Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence, pp. 4907–4913 (2021)
Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018). https://doi.org/10.3758/s13428-017-0908-4
González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 https://aclanthology.org/W17-5050
Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 https://aclanthology.org/K16-1016
Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 https://aclanthology.org/K18-1030
Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2
Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012)
Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010)
Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016). https://doi.org/10.1155/2016/8680541
Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA '15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627
[2017] Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior research methods 49, 602–615 (2017) Berzak et al. [2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: Celer: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. 
[2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. 
[2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. 
[2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. 
[2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 
71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. 
Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Hollenstein, N., Troendle, M., Zhang, C., Langer, N.: ZuCo 2.0: A dataset of physiological recordings during natural reading and annotation. In: Proceedings of the Twelfth Language Resources and Evaluation Conference, pp. 138–146. European Language Resources Association, Marseille, France (2020). https://aclanthology.org/2020.lrec-1.18 Malmaud et al. [2020] Malmaud, J., Levy, R., Berzak, Y.: Bridging information-seeking human gaze and machine reading comprehension. In: Proceedings of the 24th Conference on Computational Natural Language Learning, pp. 142–152. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.conll-1.11 . https://aclanthology.org/2020.conll-1.11 de Leeuw [2015] Leeuw, J.R.: jsPsych: A JavaScript library for creating behavioral experiments in a Web browser. Behavior Research Methods 47(1), 1–12 (2015) https://doi.org/10.3758/s13428-014-0458-y Papoutsaki et al. [2016] Papoutsaki, A., Sangkloy, P., Laskey, J., Daskalova, N., Huang, J., Hays, J.: Webgazer: Scalable webcam eye tracking using user interactions. 
In: Proceedings of the Twenty-Fifth International Joint Conference on Artificial Intelligence. IJCAI’16, pp. 3839–3845. AAAI Press, ??? (2016) Rayner et al. [2012] Rayner, K., Pollatsek, A., Ashby, J., Clifton Jr, C.: Psychology of reading (2012) Liversedge and Findlay [2000] Liversedge, S.P., Findlay, J.M.: Saccadic eye movements and cognition. Trends in cognitive sciences 4(1), 6–14 (2000) Rayner [1998] Rayner, K.: Eye movements in reading and information processing: 20 years of research. Psychological bulletin 124(3), 372 (1998) Radach and Kennedy [2004] Radach, R., Kennedy, A.: Theoretical perspectives on eye movements in reading: Past controversies, current issues, and an agenda for future research. European journal of cognitive psychology 16(1-2), 3–26 (2004) Winke [2013] Winke, P.M.: Eye-tracking technology for reading. The companion to language assessment 2, 1029–1046 (2013) Hollenstein et al. [2018] Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: Zuco, a simultaneous eeg and eye-tracking resource for natural sentence reading. Scientific data 5(1), 1–13 (2018) Cop et al. [2017] Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior research methods 49, 602–615 (2017) Berzak et al. [2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: Celer: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). Publisher: Springer Desai et al. 
[2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. 
In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. 
In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . 
https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. 
[2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Malmaud, J., Levy, R., Berzak, Y.: Bridging information-seeking human gaze and machine reading comprehension. In: Proceedings of the 24th Conference on Computational Natural Language Learning, pp. 142–152. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.conll-1.11 . https://aclanthology.org/2020.conll-1.11 de Leeuw [2015] Leeuw, J.R.: jsPsych: A JavaScript library for creating behavioral experiments in a Web browser. Behavior Research Methods 47(1), 1–12 (2015) https://doi.org/10.3758/s13428-014-0458-y Papoutsaki et al. [2016] Papoutsaki, A., Sangkloy, P., Laskey, J., Daskalova, N., Huang, J., Hays, J.: Webgazer: Scalable webcam eye tracking using user interactions. In: Proceedings of the Twenty-Fifth International Joint Conference on Artificial Intelligence. IJCAI’16, pp. 3839–3845. AAAI Press, ??? (2016) Rayner et al. [2012] Rayner, K., Pollatsek, A., Ashby, J., Clifton Jr, C.: Psychology of reading (2012) Liversedge and Findlay [2000] Liversedge, S.P., Findlay, J.M.: Saccadic eye movements and cognition. Trends in cognitive sciences 4(1), 6–14 (2000) Rayner [1998] Rayner, K.: Eye movements in reading and information processing: 20 years of research. Psychological bulletin 124(3), 372 (1998) Radach and Kennedy [2004] Radach, R., Kennedy, A.: Theoretical perspectives on eye movements in reading: Past controversies, current issues, and an agenda for future research. European journal of cognitive psychology 16(1-2), 3–26 (2004) Winke [2013] Winke, P.M.: Eye-tracking technology for reading. The companion to language assessment 2, 1029–1046 (2013) Hollenstein et al. 
[2018] Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: ZuCo, a simultaneous EEG and eye-tracking resource for natural sentence reading. Scientific Data 5(1), 1–13 (2018) Cop et al. [2017] Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior Research Methods 49, 602–615 (2017) Berzak et al. [2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: CELER: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior Research Methods, 1–21 (2022). Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in English as a second language: Evidence from the Multilingual Eye-movements Corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The Copenhagen Corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. 
[2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior Research Methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior Research Methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. 
In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . 
Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: TurkerGaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Human Information Interaction and Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature Communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. 
Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera: Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Manning et al. [2008] Manning, C.D., Raghavan, P., Schütze, H.: Introduction to Information Retrieval. 
Cambridge University Press, Cambridge (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological Review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408
[2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Rayner, K., Pollatsek, A., Ashby, J., Clifton Jr, C.: Psychology of reading (2012) Liversedge and Findlay [2000] Liversedge, S.P., Findlay, J.M.: Saccadic eye movements and cognition. Trends in cognitive sciences 4(1), 6–14 (2000) Rayner [1998] Rayner, K.: Eye movements in reading and information processing: 20 years of research. Psychological bulletin 124(3), 372 (1998) Radach and Kennedy [2004] Radach, R., Kennedy, A.: Theoretical perspectives on eye movements in reading: Past controversies, current issues, and an agenda for future research. European journal of cognitive psychology 16(1-2), 3–26 (2004) Winke [2013] Winke, P.M.: Eye-tracking technology for reading. The companion to language assessment 2, 1029–1046 (2013) Hollenstein et al. [2018] Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: Zuco, a simultaneous eeg and eye-tracking resource for natural sentence reading. Scientific data 5(1), 1–13 (2018) Cop et al. [2017] Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior research methods 49, 602–615 (2017) Berzak et al. [2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: Celer: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). Publisher: Springer Desai et al. 
[2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: Activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016)
[2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 
4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. 
[2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. 
[2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 
71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. 
Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: Zuco, a simultaneous eeg and eye-tracking resource for natural sentence reading. Scientific data 5(1), 1–13 (2018) Cop et al. [2017] Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior research methods 49, 602–615 (2017) Berzak et al. [2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: Celer: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. 
Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 
4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. 
[2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. 
[2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. 
[2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior research methods 49, 602–615 (2017) Berzak et al. [2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: Celer: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. 
[2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. 
In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . 
https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. 
Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521
Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera: Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022)
Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023)
Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421
Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264
Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval, vol. 39. Cambridge University Press, Cambridge (2008)
Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8
Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000)
Just, M.A., Carpenter, P.A.: A theory of reading: From eye fixations to comprehension. Psychological Review 87(4), 329 (1980)
DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408
Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021)
Chiang, C.-H., Lee, H.-Y.: Re-examining human annotations for interpretable NLP (2022)
Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023)
Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022)
Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155. IEEE (2021)
Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423
Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022)
Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023)
Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: CELER: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022)
Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior Research Methods, 1–21 (2022)
Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: Activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016)
Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in English as a second language: Evidence from the Multilingual Eye-movements Corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023)
Hollenstein, N., Barrett, M., Björnsdóttir, M.: The Copenhagen Corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182
Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior Research Methods, 1–7 (2023)
Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: Empirical foundations for a minimal reporting guideline. Behavior Research Methods 55(1), 364–416 (2023)
Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019)
Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer (2023)
Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence, pp. 4907–4913 (2021)
Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4
González-Garduño, A.V., Søgaard, A.: Using gaze to predict text readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050
Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016
Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence classification with human attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030
Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2
Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012)
Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: How speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010)
Ferhat, O., Vilariño, F.: Low cost eye tracking: The current panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541
Papoutsaki, A.: Scalable webcam eye tracking by learning from user interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems (CHI EA '15), pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627
Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101
Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123
Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: TurkerGaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015)
Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam eye tracking for remote studies of web search. In: Proceedings of the 2017 Conference on Human Information Interaction and Retrieval (CHIIR '17), pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170
Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-based personalized summarization of Wikipedia reading session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext (HUMAN '20). Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662
Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature Communications 11(1), 4553 (2020)
Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521
1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. 
arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. 
In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. 
[2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. 
[2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . 
https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. 
[2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. 
In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. 
In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . 
https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. 
[2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . 
https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. 
CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. 
[2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022). https://doi.org/10.1016/j.bspc.2022.103521
Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera: Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022)
Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023)
Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421. https://aclanthology.org/2020.acl-main.421
Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264. https://aclanthology.org/D16-1264
Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval, vol. 39. Cambridge University Press, Cambridge (2008)
Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online.
Behavior Research Methods 48(3), 829–842 (2016). https://doi.org/10.3758/s13428-015-0642-8
Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000)
Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: From eye fixations to comprehension. Psychological Review 87(4), 329 (1980)
DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408. https://aclanthology.org/2020.acl-main.408
Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021)
Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-examining human annotations for interpretable NLP (2022)
Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023)
Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022)
Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155. IEEE (2021)
Devlin et al.
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423. https://aclanthology.org/N19-1423
Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022)
Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023)
Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: Empirical foundations for a minimal reporting guideline. Behavior Research Methods 55(1), 364–416 (2023)
Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019)
Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer (2023)
Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing.
In: Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence (IJCAI), pp. 4907–4913 (2021)
Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018). https://doi.org/10.3758/s13428-017-0908-4
González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using gaze to predict text readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050. https://aclanthology.org/W17-5050
Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. 
In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems (CHI EA '15), pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627
https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
https://aclanthology.org/2020.acl-main.421
[2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. 
[2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. 
[2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. 
In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 
1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. 
[2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . 
https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. 
[2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. 
[2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. 
Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016). https://doi.org/10.3758/s13428-015-0642-8

Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000)

Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: From eye fixations to comprehension. Psychological Review 87(4), 329 (1980)

DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408

Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021)

Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-examining human annotations for interpretable NLP (2022)

Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023)

Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022)

Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155. IEEE (2021)

Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423

Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022)

Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023)

Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)

Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera: Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022)

Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023)

Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421

Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264

Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval, vol. 39. Cambridge University Press, Cambridge (2008)
Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). 
IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . 
https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
  9. Hollenstein, N., Zhang, C.: Entity recognition at first sight: Improving NER with eye movement information. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 1–10. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1001 . https://aclanthology.org/N19-1001 Hollenstein and Beinborn [2021] Hollenstein, N., Beinborn, L.: Relative importance in sentence processing. In: Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 2: Short Papers), pp. 141–150. Association for Computational Linguistics, Online (2021). https://doi.org/10.18653/v1/2021.acl-short.19 . https://aclanthology.org/2021.acl-short.19 McGuire and Tomuro [2021] McGuire, E., Tomuro, N.: Sentiment analysis with cognitive attention supervision. In: Canadian Conference on AI (2021) Dong et al. [2022] Dong, S., Goldstein, J., Yang, G.H.: GazBy: Gaze-Based BERT Model to Incorporate Human Attention in Neural Information Retrieval. In: Proceedings of the 2022 ACM SIGIR International Conference on Theory of Information Retrieval, pp. 182–192 (2022). https://doi.org/10.1145/3539813.3545129 Eberle et al. [2022] Eberle, O., Brandl, S., Pilot, J., Søgaard, A.: Do transformer models show similar attention patterns to task-specific human gaze? In: Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pp. 4295–4309. Association for Computational Linguistics, Dublin, Ireland (2022). https://doi.org/10.18653/v1/2022.acl-long.296 . https://aclanthology.org/2022.acl-long.296 Hollenstein et al. [2020] Hollenstein, N., Troendle, M., Zhang, C., Langer, N.: ZuCo 2.0: A dataset of physiological recordings during natural reading and annotation. 
In: Proceedings of the Twelfth Language Resources and Evaluation Conference, pp. 138–146. European Language Resources Association, Marseille, France (2020). https://aclanthology.org/2020.lrec-1.18 Malmaud et al. [2020] Malmaud, J., Levy, R., Berzak, Y.: Bridging information-seeking human gaze and machine reading comprehension. In: Proceedings of the 24th Conference on Computational Natural Language Learning, pp. 142–152. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.conll-1.11 . https://aclanthology.org/2020.conll-1.11 de Leeuw [2015] Leeuw, J.R.: jsPsych: A JavaScript library for creating behavioral experiments in a Web browser. Behavior Research Methods 47(1), 1–12 (2015) https://doi.org/10.3758/s13428-014-0458-y Papoutsaki et al. [2016] Papoutsaki, A., Sangkloy, P., Laskey, J., Daskalova, N., Huang, J., Hays, J.: Webgazer: Scalable webcam eye tracking using user interactions. In: Proceedings of the Twenty-Fifth International Joint Conference on Artificial Intelligence. IJCAI’16, pp. 3839–3845. AAAI Press, ??? (2016) Rayner et al. [2012] Rayner, K., Pollatsek, A., Ashby, J., Clifton Jr, C.: Psychology of reading (2012) Liversedge and Findlay [2000] Liversedge, S.P., Findlay, J.M.: Saccadic eye movements and cognition. Trends in cognitive sciences 4(1), 6–14 (2000) Rayner [1998] Rayner, K.: Eye movements in reading and information processing: 20 years of research. Psychological bulletin 124(3), 372 (1998) Radach and Kennedy [2004] Radach, R., Kennedy, A.: Theoretical perspectives on eye movements in reading: Past controversies, current issues, and an agenda for future research. European journal of cognitive psychology 16(1-2), 3–26 (2004) Winke [2013] Winke, P.M.: Eye-tracking technology for reading. The companion to language assessment 2, 1029–1046 (2013) Hollenstein et al. 
[2018] Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: Zuco, a simultaneous eeg and eye-tracking resource for natural sentence reading. Scientific data 5(1), 1–13 (2018) Cop et al. [2017] Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior research methods 49, 602–615 (2017) Berzak et al. [2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: Celer: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. 
[2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. 
In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . 
https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. 
Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. 
Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Hollenstein, N., Beinborn, L.: Relative importance in sentence processing. In: Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 2: Short Papers), pp. 141–150. Association for Computational Linguistics, Online (2021). https://doi.org/10.18653/v1/2021.acl-short.19 . https://aclanthology.org/2021.acl-short.19 McGuire and Tomuro [2021] McGuire, E., Tomuro, N.: Sentiment analysis with cognitive attention supervision. In: Canadian Conference on AI (2021) Dong et al. [2022] Dong, S., Goldstein, J., Yang, G.H.: GazBy: Gaze-Based BERT Model to Incorporate Human Attention in Neural Information Retrieval. 
In: Proceedings of the 2022 ACM SIGIR International Conference on Theory of Information Retrieval, pp. 182–192 (2022). https://doi.org/10.1145/3539813.3545129 Eberle et al. [2022] Eberle, O., Brandl, S., Pilot, J., Søgaard, A.: Do transformer models show similar attention patterns to task-specific human gaze? In: Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pp. 4295–4309. Association for Computational Linguistics, Dublin, Ireland (2022). https://doi.org/10.18653/v1/2022.acl-long.296 . https://aclanthology.org/2022.acl-long.296 Hollenstein et al. [2020] Hollenstein, N., Troendle, M., Zhang, C., Langer, N.: ZuCo 2.0: A dataset of physiological recordings during natural reading and annotation. In: Proceedings of the Twelfth Language Resources and Evaluation Conference, pp. 138–146. European Language Resources Association, Marseille, France (2020). https://aclanthology.org/2020.lrec-1.18 Malmaud et al. [2020] Malmaud, J., Levy, R., Berzak, Y.: Bridging information-seeking human gaze and machine reading comprehension. In: Proceedings of the 24th Conference on Computational Natural Language Learning, pp. 142–152. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.conll-1.11 . https://aclanthology.org/2020.conll-1.11 de Leeuw [2015] Leeuw, J.R.: jsPsych: A JavaScript library for creating behavioral experiments in a Web browser. Behavior Research Methods 47(1), 1–12 (2015) https://doi.org/10.3758/s13428-014-0458-y Papoutsaki et al. [2016] Papoutsaki, A., Sangkloy, P., Laskey, J., Daskalova, N., Huang, J., Hays, J.: Webgazer: Scalable webcam eye tracking using user interactions. In: Proceedings of the Twenty-Fifth International Joint Conference on Artificial Intelligence. IJCAI’16, pp. 3839–3845. AAAI Press, ??? (2016) Rayner et al. 
[2012] Rayner, K., Pollatsek, A., Ashby, J., Clifton Jr, C.: Psychology of reading (2012) Liversedge and Findlay [2000] Liversedge, S.P., Findlay, J.M.: Saccadic eye movements and cognition. Trends in cognitive sciences 4(1), 6–14 (2000) Rayner [1998] Rayner, K.: Eye movements in reading and information processing: 20 years of research. Psychological bulletin 124(3), 372 (1998) Radach and Kennedy [2004] Radach, R., Kennedy, A.: Theoretical perspectives on eye movements in reading: Past controversies, current issues, and an agenda for future research. European journal of cognitive psychology 16(1-2), 3–26 (2004) Winke [2013] Winke, P.M.: Eye-tracking technology for reading. The companion to language assessment 2, 1029–1046 (2013) Hollenstein et al. [2018] Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: Zuco, a simultaneous eeg and eye-tracking resource for natural sentence reading. Scientific data 5(1), 1–13 (2018) Cop et al. [2017] Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior research methods 49, 602–615 (2017) Berzak et al. [2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: Celer: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. 
[2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in English as a second language: Evidence from the Multilingual Eye-Movements Corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The Copenhagen Corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior Research Methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior Research Methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence, pp.
4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. 
[2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: TurkerGaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Human Information Interaction and Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 Dubey et al.
[2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature Communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera: Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al.
[2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) McGuire, E., Tomuro, N.: Sentiment analysis with cognitive attention supervision. In: Canadian Conference on AI (2021) Dong et al. [2022] Dong, S., Goldstein, J., Yang, G.H.: GazBy: Gaze-Based BERT Model to Incorporate Human Attention in Neural Information Retrieval. In: Proceedings of the 2022 ACM SIGIR International Conference on Theory of Information Retrieval, pp. 182–192 (2022). https://doi.org/10.1145/3539813.3545129 Eberle et al. [2022] Eberle, O., Brandl, S., Pilot, J., Søgaard, A.: Do transformer models show similar attention patterns to task-specific human gaze? In: Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pp. 4295–4309. Association for Computational Linguistics, Dublin, Ireland (2022). https://doi.org/10.18653/v1/2022.acl-long.296 . https://aclanthology.org/2022.acl-long.296 Hollenstein et al. [2020] Hollenstein, N., Troendle, M., Zhang, C., Langer, N.: ZuCo 2.0: A dataset of physiological recordings during natural reading and annotation. In: Proceedings of the Twelfth Language Resources and Evaluation Conference, pp. 138–146. European Language Resources Association, Marseille, France (2020). https://aclanthology.org/2020.lrec-1.18 Malmaud et al. [2020] Malmaud, J., Levy, R., Berzak, Y.: Bridging information-seeking human gaze and machine reading comprehension. In: Proceedings of the 24th Conference on Computational Natural Language Learning, pp. 142–152. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.conll-1.11 . https://aclanthology.org/2020.conll-1.11 de Leeuw [2015] Leeuw, J.R.: jsPsych: A JavaScript library for creating behavioral experiments in a Web browser. 
Behavior Research Methods 47(1), 1–12 (2015) https://doi.org/10.3758/s13428-014-0458-y Papoutsaki et al. [2016] Papoutsaki, A., Sangkloy, P., Laskey, J., Daskalova, N., Huang, J., Hays, J.: Webgazer: Scalable webcam eye tracking using user interactions. In: Proceedings of the Twenty-Fifth International Joint Conference on Artificial Intelligence. IJCAI’16, pp. 3839–3845. AAAI Press, ??? (2016) Rayner et al. [2012] Rayner, K., Pollatsek, A., Ashby, J., Clifton Jr, C.: Psychology of reading (2012) Liversedge and Findlay [2000] Liversedge, S.P., Findlay, J.M.: Saccadic eye movements and cognition. Trends in cognitive sciences 4(1), 6–14 (2000) Rayner [1998] Rayner, K.: Eye movements in reading and information processing: 20 years of research. Psychological bulletin 124(3), 372 (1998) Radach and Kennedy [2004] Radach, R., Kennedy, A.: Theoretical perspectives on eye movements in reading: Past controversies, current issues, and an agenda for future research. European journal of cognitive psychology 16(1-2), 3–26 (2004) Winke [2013] Winke, P.M.: Eye-tracking technology for reading. The companion to language assessment 2, 1029–1046 (2013) Hollenstein et al. [2018] Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: Zuco, a simultaneous eeg and eye-tracking resource for natural sentence reading. Scientific data 5(1), 1–13 (2018) Cop et al. [2017] Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior research methods 49, 602–615 (2017) Berzak et al. [2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: Celer: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. 
[2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. 
arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. 
In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. 
[2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. 
[2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . 
https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. 
[2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Dong, S., Goldstein, J., Yang, G.H.: GazBy: Gaze-Based BERT Model to Incorporate Human Attention in Neural Information Retrieval. In: Proceedings of the 2022 ACM SIGIR International Conference on Theory of Information Retrieval, pp. 182–192 (2022). https://doi.org/10.1145/3539813.3545129 Eberle et al. [2022] Eberle, O., Brandl, S., Pilot, J., Søgaard, A.: Do transformer models show similar attention patterns to task-specific human gaze? In: Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pp. 4295–4309. Association for Computational Linguistics, Dublin, Ireland (2022). https://doi.org/10.18653/v1/2022.acl-long.296 . https://aclanthology.org/2022.acl-long.296 Hollenstein et al. [2020] Hollenstein, N., Troendle, M., Zhang, C., Langer, N.: ZuCo 2.0: A dataset of physiological recordings during natural reading and annotation. In: Proceedings of the Twelfth Language Resources and Evaluation Conference, pp. 138–146. European Language Resources Association, Marseille, France (2020). https://aclanthology.org/2020.lrec-1.18 Malmaud et al. [2020] Malmaud, J., Levy, R., Berzak, Y.: Bridging information-seeking human gaze and machine reading comprehension. In: Proceedings of the 24th Conference on Computational Natural Language Learning, pp. 142–152. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.conll-1.11 . 
https://aclanthology.org/2020.conll-1.11 de Leeuw [2015] Leeuw, J.R.: jsPsych: A JavaScript library for creating behavioral experiments in a Web browser. Behavior Research Methods 47(1), 1–12 (2015) https://doi.org/10.3758/s13428-014-0458-y Papoutsaki et al. [2016] Papoutsaki, A., Sangkloy, P., Laskey, J., Daskalova, N., Huang, J., Hays, J.: Webgazer: Scalable webcam eye tracking using user interactions. In: Proceedings of the Twenty-Fifth International Joint Conference on Artificial Intelligence. IJCAI’16, pp. 3839–3845. AAAI Press, ??? (2016) Rayner et al. [2012] Rayner, K., Pollatsek, A., Ashby, J., Clifton Jr, C.: Psychology of reading (2012) Liversedge and Findlay [2000] Liversedge, S.P., Findlay, J.M.: Saccadic eye movements and cognition. Trends in cognitive sciences 4(1), 6–14 (2000) Rayner [1998] Rayner, K.: Eye movements in reading and information processing: 20 years of research. Psychological bulletin 124(3), 372 (1998) Radach and Kennedy [2004] Radach, R., Kennedy, A.: Theoretical perspectives on eye movements in reading: Past controversies, current issues, and an agenda for future research. European journal of cognitive psychology 16(1-2), 3–26 (2004) Winke [2013] Winke, P.M.: Eye-tracking technology for reading. The companion to language assessment 2, 1029–1046 (2013) Hollenstein et al. [2018] Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: Zuco, a simultaneous eeg and eye-tracking resource for natural sentence reading. Scientific data 5(1), 1–13 (2018) Cop et al. [2017] Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior research methods 49, 602–615 (2017) Berzak et al. [2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: Celer: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. 
[2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. 
arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. 
In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: TurkerGaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. 
[2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Human Information Interaction and Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature Communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera: Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. 
[2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval, vol. 39. Cambridge University Press, Cambridge (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological Review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . 
https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. 
[2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Eberle, O., Brandl, S., Pilot, J., Søgaard, A.: Do transformer models show similar attention patterns to task-specific human gaze? In: Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pp. 4295–4309. Association for Computational Linguistics, Dublin, Ireland (2022). https://doi.org/10.18653/v1/2022.acl-long.296 . https://aclanthology.org/2022.acl-long.296 Hollenstein et al. [2020] Hollenstein, N., Troendle, M., Zhang, C., Langer, N.: ZuCo 2.0: A dataset of physiological recordings during natural reading and annotation. In: Proceedings of the Twelfth Language Resources and Evaluation Conference, pp. 138–146. European Language Resources Association, Marseille, France (2020). https://aclanthology.org/2020.lrec-1.18 Malmaud et al. [2020] Malmaud, J., Levy, R., Berzak, Y.: Bridging information-seeking human gaze and machine reading comprehension. In: Proceedings of the 24th Conference on Computational Natural Language Learning, pp. 142–152. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.conll-1.11 . https://aclanthology.org/2020.conll-1.11 de Leeuw [2015] Leeuw, J.R.: jsPsych: A JavaScript library for creating behavioral experiments in a Web browser. Behavior Research Methods 47(1), 1–12 (2015) https://doi.org/10.3758/s13428-014-0458-y Papoutsaki et al. [2016] Papoutsaki, A., Sangkloy, P., Laskey, J., Daskalova, N., Huang, J., Hays, J.: Webgazer: Scalable webcam eye tracking using user interactions. 
In: Proceedings of the Twenty-Fifth International Joint Conference on Artificial Intelligence. IJCAI’16, pp. 3839–3845. AAAI Press, ??? (2016) Rayner et al. [2012] Rayner, K., Pollatsek, A., Ashby, J., Clifton Jr, C.: Psychology of reading (2012) Liversedge and Findlay [2000] Liversedge, S.P., Findlay, J.M.: Saccadic eye movements and cognition. Trends in cognitive sciences 4(1), 6–14 (2000) Rayner [1998] Rayner, K.: Eye movements in reading and information processing: 20 years of research. Psychological bulletin 124(3), 372 (1998) Radach and Kennedy [2004] Radach, R., Kennedy, A.: Theoretical perspectives on eye movements in reading: Past controversies, current issues, and an agenda for future research. European journal of cognitive psychology 16(1-2), 3–26 (2004) Winke [2013] Winke, P.M.: Eye-tracking technology for reading. The companion to language assessment 2, 1029–1046 (2013) Hollenstein et al. [2018] Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: Zuco, a simultaneous eeg and eye-tracking resource for natural sentence reading. Scientific data 5(1), 1–13 (2018) Cop et al. [2017] Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior research methods 49, 602–615 (2017) Berzak et al. [2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: Celer: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). Publisher: Springer Desai et al. 
[2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. 
In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. 
In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . 
https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. 
[2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Hollenstein, N., Troendle, M., Zhang, C., Langer, N.: ZuCo 2.0: A dataset of physiological recordings during natural reading and annotation. In: Proceedings of the Twelfth Language Resources and Evaluation Conference, pp. 138–146. European Language Resources Association, Marseille, France (2020). https://aclanthology.org/2020.lrec-1.18 Malmaud et al. [2020] Malmaud, J., Levy, R., Berzak, Y.: Bridging information-seeking human gaze and machine reading comprehension. In: Proceedings of the 24th Conference on Computational Natural Language Learning, pp. 142–152. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.conll-1.11 . https://aclanthology.org/2020.conll-1.11 de Leeuw [2015] Leeuw, J.R.: jsPsych: A JavaScript library for creating behavioral experiments in a Web browser. Behavior Research Methods 47(1), 1–12 (2015) https://doi.org/10.3758/s13428-014-0458-y Papoutsaki et al. [2016] Papoutsaki, A., Sangkloy, P., Laskey, J., Daskalova, N., Huang, J., Hays, J.: Webgazer: Scalable webcam eye tracking using user interactions. In: Proceedings of the Twenty-Fifth International Joint Conference on Artificial Intelligence. IJCAI’16, pp. 3839–3845. AAAI Press, ??? (2016) Rayner et al. [2012] Rayner, K., Pollatsek, A., Ashby, J., Clifton Jr, C.: Psychology of reading (2012) Liversedge and Findlay [2000] Liversedge, S.P., Findlay, J.M.: Saccadic eye movements and cognition. Trends in cognitive sciences 4(1), 6–14 (2000) Rayner [1998] Rayner, K.: Eye movements in reading and information processing: 20 years of research. Psychological bulletin 124(3), 372 (1998) Radach and Kennedy [2004] Radach, R., Kennedy, A.: Theoretical perspectives on eye movements in reading: Past controversies, current issues, and an agenda for future research. 
European journal of cognitive psychology 16(1-2), 3–26 (2004) Winke [2013] Winke, P.M.: Eye-tracking technology for reading. The companion to language assessment 2, 1029–1046 (2013) Hollenstein et al. [2018] Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: Zuco, a simultaneous eeg and eye-tracking resource for natural sentence reading. Scientific data 5(1), 1–13 (2018) Cop et al. [2017] Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior research methods 49, 602–615 (2017) Berzak et al. [2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: Celer: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). 
https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. 
[2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. 
Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Human Information Interaction and Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature Communications 11(1), 4553 (2020) Lin et al.
[2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera: Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press, Cambridge (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online.
Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological Review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasović [2021] Wiegreffe, S., Marasović, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al.
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Malmaud et al. [2020] Malmaud, J., Levy, R., Berzak, Y.: Bridging information-seeking human gaze and machine reading comprehension. In: Proceedings of the 24th Conference on Computational Natural Language Learning, pp. 142–152. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.conll-1.11 . https://aclanthology.org/2020.conll-1.11 de Leeuw [2015] de Leeuw, J.R.: jsPsych: A JavaScript library for creating behavioral experiments in a Web browser. Behavior Research Methods 47(1), 1–12 (2015) https://doi.org/10.3758/s13428-014-0458-y Papoutsaki et al. [2016] Papoutsaki, A., Sangkloy, P., Laskey, J., Daskalova, N., Huang, J., Hays, J.: WebGazer: Scalable webcam eye tracking using user interactions.
In: Proceedings of the Twenty-Fifth International Joint Conference on Artificial Intelligence. IJCAI’16, pp. 3839–3845. AAAI Press (2016) Rayner et al. [2012] Rayner, K., Pollatsek, A., Ashby, J., Clifton Jr, C.: Psychology of reading (2012) Liversedge and Findlay [2000] Liversedge, S.P., Findlay, J.M.: Saccadic eye movements and cognition. Trends in Cognitive Sciences 4(1), 6–14 (2000) Rayner [1998] Rayner, K.: Eye movements in reading and information processing: 20 years of research. Psychological Bulletin 124(3), 372 (1998) Radach and Kennedy [2004] Radach, R., Kennedy, A.: Theoretical perspectives on eye movements in reading: Past controversies, current issues, and an agenda for future research. European Journal of Cognitive Psychology 16(1-2), 3–26 (2004) Winke [2013] Winke, P.M.: Eye-tracking technology for reading. The Companion to Language Assessment 2, 1029–1046 (2013) Hollenstein et al. [2018] Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: ZuCo, a simultaneous EEG and eye-tracking resource for natural sentence reading. Scientific Data 5(1), 1–13 (2018) Cop et al. [2017] Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior Research Methods 49, 602–615 (2017) Berzak et al. [2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: CELER: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior Research Methods, 1–21 (2022). Publisher: Springer Desai et al.
[2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in English as a second language: Evidence from the Multilingual Eye-Movements Corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The Copenhagen Corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182
[2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: Celer: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. 
Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . 
https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. 
Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . 
https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
Behavior research methods 49, 602–615 (2017) Berzak et al. [2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: Celer: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. 
[2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. 
[2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. 
Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. 
Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: Celer: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. 
[2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 
4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. 
[2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. 
[2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. 
[2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. 
Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . 
https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. 
[2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. 
[2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. 
In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 
1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in English as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The Copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior Research Methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior Research Methods 55(1), 364–416 (2023) Hollenstein et al.
[2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al.
[2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al.
[2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Human Information Interaction and Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature Communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera: Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022)
https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. 
CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. 
[2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. 
Biomedical Signal Processing and Control 74, 103521 (2022). https://doi.org/10.1016/j.bspc.2022.103521
71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. 
Gureckis et al.
[2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. 
In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. 
[2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . 
https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. 
[2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. 
Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. 
[2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. 
[2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. 
Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. 
[2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. 
Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. 
[2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. 
[2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. 
In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. 
[2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. 
[2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. 
In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 
45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 
45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. 
Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). 
IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . 
https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
10. Hollenstein, N., Beinborn, L.: Relative importance in sentence processing. In: Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 2: Short Papers), pp. 141–150. Association for Computational Linguistics, Online (2021). https://doi.org/10.18653/v1/2021.acl-short.19
11. McGuire, E., Tomuro, N.: Sentiment analysis with cognitive attention supervision. In: Canadian Conference on AI (2021)
12. Dong, S., Goldstein, J., Yang, G.H.: GazBy: Gaze-based BERT model to incorporate human attention in neural information retrieval. In: Proceedings of the 2022 ACM SIGIR International Conference on Theory of Information Retrieval, pp. 182–192 (2022). https://doi.org/10.1145/3539813.3545129
13. Eberle, O., Brandl, S., Pilot, J., Søgaard, A.: Do transformer models show similar attention patterns to task-specific human gaze? In: Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pp. 4295–4309. Association for Computational Linguistics, Dublin, Ireland (2022). https://doi.org/10.18653/v1/2022.acl-long.296
14. Hollenstein, N., Troendle, M., Zhang, C., Langer, N.: ZuCo 2.0: A dataset of physiological recordings during natural reading and annotation. In: Proceedings of the Twelfth Language Resources and Evaluation Conference, pp. 138–146. European Language Resources Association, Marseille, France (2020). https://aclanthology.org/2020.lrec-1.18
15. Malmaud, J., Levy, R., Berzak, Y.: Bridging information-seeking human gaze and machine reading comprehension. In: Proceedings of the 24th Conference on Computational Natural Language Learning, pp. 142–152. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.conll-1.11
16. de Leeuw, J.R.: jsPsych: A JavaScript library for creating behavioral experiments in a Web browser. Behavior Research Methods 47(1), 1–12 (2015). https://doi.org/10.3758/s13428-014-0458-y
17. Papoutsaki, A., Sangkloy, P., Laskey, J., Daskalova, N., Huang, J., Hays, J.: WebGazer: Scalable webcam eye tracking using user interactions. In: Proceedings of the Twenty-Fifth International Joint Conference on Artificial Intelligence. IJCAI'16, pp. 3839–3845. AAAI Press (2016)
18. Rayner, K., Pollatsek, A., Ashby, J., Clifton Jr, C.: Psychology of Reading (2012)
19. Liversedge, S.P., Findlay, J.M.: Saccadic eye movements and cognition. Trends in Cognitive Sciences 4(1), 6–14 (2000)
20. Rayner, K.: Eye movements in reading and information processing: 20 years of research. Psychological Bulletin 124(3), 372 (1998)
21. Radach, R., Kennedy, A.: Theoretical perspectives on eye movements in reading: Past controversies, current issues, and an agenda for future research. European Journal of Cognitive Psychology 16(1–2), 3–26 (2004)
22. Winke, P.M.: Eye-tracking technology for reading. The Companion to Language Assessment 2, 1029–1046 (2013)
23. Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: ZuCo, a simultaneous EEG and eye-tracking resource for natural sentence reading. Scientific Data 5(1), 1–13 (2018)
24. Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior Research Methods 49, 602–615 (2017)
25. Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: CELER: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022)
26. Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior Research Methods, 1–21 (2022). Springer
27. Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016)
28. Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in English as a second language: Evidence from the Multilingual Eye-movements Corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023)
29. Hollenstein, N., Barrett, M., Björnsdóttir, M.: The Copenhagen Corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182
30. Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior Research Methods, 1–7 (2023)
31. Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior Research Methods 55(1), 364–416 (2023)
32. Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019)
33. Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer (2023)
34. Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence, pp. 4907–4913 (2021)
35. Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018). https://doi.org/10.3758/s13428-017-0908-4
36. González-Garduño, A.V., Søgaard, A.: Using gaze to predict text readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050
37. Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016
38. Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence classification with human attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030
39. Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2
40. Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012)
41. Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010)
42. Ferhat, O., Vilariño, F.: Low cost eye tracking: The current panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016). https://doi.org/10.1155/2016/8680541. Hindawi Publishing Corporation
43. Papoutsaki, A.: Scalable webcam eye tracking by learning from user interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA '15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627
44. Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013). https://doi.org/10.1109/TPAMI.2012.101
45. Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014). https://doi.org/10.1109/TPAMI.2014.2313123
46. Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: TurkerGaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015)
47. Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam eye tracking for remote studies of web search. In: Proceedings of the 2017 Conference on Human Information Interaction and Retrieval. CHIIR '17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170
48. Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-based personalized summarization of Wikipedia reading session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN'20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662
49. Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature Communications 11(1), 4553 (2020)
50. Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022). https://doi.org/10.1016/j.bspc.2022.103521
51. Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera: Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022)
52. Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023)
53. Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421
54. Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264
55. Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval, vol. 39. Cambridge University Press, Cambridge (2008)
56. Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016). https://doi.org/10.3758/s13428-015-0642-8
57. Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000)
58. Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological Review 87(4), 329 (1980)
59. DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408
60. Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021)
61. Chiang, C.-H., Lee, H.-Y.: Re-examining human annotations for interpretable NLP (2022)
62. Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023)
63. Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022)
64. Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155. IEEE (2021)
65. Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423
66. Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022)
67. Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023)
68. Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
https://doi.org/10.18653/v1/2020.conll-1.11 . https://aclanthology.org/2020.conll-1.11 de Leeuw [2015] Leeuw, J.R.: jsPsych: A JavaScript library for creating behavioral experiments in a Web browser. Behavior Research Methods 47(1), 1–12 (2015) https://doi.org/10.3758/s13428-014-0458-y Papoutsaki et al. [2016] Papoutsaki, A., Sangkloy, P., Laskey, J., Daskalova, N., Huang, J., Hays, J.: Webgazer: Scalable webcam eye tracking using user interactions. In: Proceedings of the Twenty-Fifth International Joint Conference on Artificial Intelligence. IJCAI’16, pp. 3839–3845. AAAI Press, ??? (2016) Rayner et al. [2012] Rayner, K., Pollatsek, A., Ashby, J., Clifton Jr, C.: Psychology of reading (2012) Liversedge and Findlay [2000] Liversedge, S.P., Findlay, J.M.: Saccadic eye movements and cognition. Trends in cognitive sciences 4(1), 6–14 (2000) Rayner [1998] Rayner, K.: Eye movements in reading and information processing: 20 years of research. Psychological bulletin 124(3), 372 (1998) Radach and Kennedy [2004] Radach, R., Kennedy, A.: Theoretical perspectives on eye movements in reading: Past controversies, current issues, and an agenda for future research. European journal of cognitive psychology 16(1-2), 3–26 (2004) Winke [2013] Winke, P.M.: Eye-tracking technology for reading. The companion to language assessment 2, 1029–1046 (2013) Hollenstein et al. [2018] Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: Zuco, a simultaneous eeg and eye-tracking resource for natural sentence reading. Scientific data 5(1), 1–13 (2018) Cop et al. [2017] Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior research methods 49, 602–615 (2017) Berzak et al. [2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: Celer: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. 
[2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. 
arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. 
In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. 
[2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. 
In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. 
In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . 
Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-based personalized summarization of Wikipedia reading session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext (HUMAN'20). Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662
Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature Communications 11(1), 4553 (2020)
Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022). https://doi.org/10.1016/j.bspc.2022.103521
Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera: Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022)
Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023)
In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . 
https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. 
[2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Leeuw, J.R.: jsPsych: A JavaScript library for creating behavioral experiments in a Web browser. Behavior Research Methods 47(1), 1–12 (2015) https://doi.org/10.3758/s13428-014-0458-y Papoutsaki et al. [2016] Papoutsaki, A., Sangkloy, P., Laskey, J., Daskalova, N., Huang, J., Hays, J.: Webgazer: Scalable webcam eye tracking using user interactions. In: Proceedings of the Twenty-Fifth International Joint Conference on Artificial Intelligence. IJCAI’16, pp. 3839–3845. AAAI Press, ??? (2016) Rayner et al. [2012] Rayner, K., Pollatsek, A., Ashby, J., Clifton Jr, C.: Psychology of reading (2012) Liversedge and Findlay [2000] Liversedge, S.P., Findlay, J.M.: Saccadic eye movements and cognition. Trends in cognitive sciences 4(1), 6–14 (2000) Rayner [1998] Rayner, K.: Eye movements in reading and information processing: 20 years of research. Psychological bulletin 124(3), 372 (1998) Radach and Kennedy [2004] Radach, R., Kennedy, A.: Theoretical perspectives on eye movements in reading: Past controversies, current issues, and an agenda for future research. European journal of cognitive psychology 16(1-2), 3–26 (2004) Winke [2013] Winke, P.M.: Eye-tracking technology for reading. The companion to language assessment 2, 1029–1046 (2013) Hollenstein et al. [2018] Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: Zuco, a simultaneous eeg and eye-tracking resource for natural sentence reading. Scientific data 5(1), 1–13 (2018) Cop et al. [2017] Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior research methods 49, 602–615 (2017) Berzak et al. 
[2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: Celer: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. 
Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . 
https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. 
[2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. 
[2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. 
In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 
1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Papoutsaki, A., Sangkloy, P., Laskey, J., Daskalova, N., Huang, J., Hays, J.: Webgazer: Scalable webcam eye tracking using user interactions. In: Proceedings of the Twenty-Fifth International Joint Conference on Artificial Intelligence. IJCAI’16, pp. 3839–3845. AAAI Press, ??? (2016) Rayner et al. [2012] Rayner, K., Pollatsek, A., Ashby, J., Clifton Jr, C.: Psychology of reading (2012) Liversedge and Findlay [2000] Liversedge, S.P., Findlay, J.M.: Saccadic eye movements and cognition. Trends in cognitive sciences 4(1), 6–14 (2000) Rayner [1998] Rayner, K.: Eye movements in reading and information processing: 20 years of research. Psychological bulletin 124(3), 372 (1998) Radach and Kennedy [2004] Radach, R., Kennedy, A.: Theoretical perspectives on eye movements in reading: Past controversies, current issues, and an agenda for future research. European journal of cognitive psychology 16(1-2), 3–26 (2004) Winke [2013] Winke, P.M.: Eye-tracking technology for reading. The companion to language assessment 2, 1029–1046 (2013) Hollenstein et al. [2018] Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: Zuco, a simultaneous eeg and eye-tracking resource for natural sentence reading. Scientific data 5(1), 1–13 (2018) Cop et al. [2017] Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior research methods 49, 602–615 (2017) Berzak et al. 
[2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: Celer: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. 
Behavior Research Methods 55(1), 364–416 (2023)
Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019)
Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer (2023)
Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence, pp. 4907–4913 (2021)
Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018). https://doi.org/10.3758/s13428-017-0908-4
González-Garduño, A.V., Søgaard, A.: Using gaze to predict text readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050
Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016
Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence classification with human attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030
Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2
Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012)
Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: How speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010)
Ferhat, O., Vilariño, F.: Low cost eye tracking: The current panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016). https://doi.org/10.1155/2016/8680541
Papoutsaki, A.: Scalable webcam eye tracking by learning from user interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems (CHI EA '15), pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627
Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013). https://doi.org/10.1109/TPAMI.2012.101
Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014). https://doi.org/10.1109/TPAMI.2014.2313123
Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: TurkerGaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015)
Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam eye tracking for remote studies of web search. In: Proceedings of the 2017 Conference on Human Information Interaction and Retrieval (CHIIR '17), pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170
Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-based personalized summarization of Wikipedia reading session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext (HUMAN '20). Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662
Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature Communications 11(1), 4553 (2020)
Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022). https://doi.org/10.1016/j.bspc.2022.103521
Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera: Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022)
Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023)
Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421
Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264
Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval, vol. 39. Cambridge University Press, Cambridge (2008)
Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016). https://doi.org/10.3758/s13428-015-0642-8
Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000)
Just, M.A., Carpenter, P.A.: A theory of reading: From eye fixations to comprehension. Psychological Review 87(4), 329 (1980)
DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408
Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021)
Chiang, C.-H., Lee, H.-Y.: Re-examining human annotations for interpretable NLP (2022)
Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023)
Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022)
Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155. IEEE (2021)
Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423
Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022)
Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023)
Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
Rayner, K., Pollatsek, A., Ashby, J., Clifton Jr, C.: Psychology of Reading (2012)
Liversedge, S.P., Findlay, J.M.: Saccadic eye movements and cognition. Trends in Cognitive Sciences 4(1), 6–14 (2000)
Rayner, K.: Eye movements in reading and information processing: 20 years of research. Psychological Bulletin 124(3), 372 (1998)
Radach, R., Kennedy, A.: Theoretical perspectives on eye movements in reading: Past controversies, current issues, and an agenda for future research. European Journal of Cognitive Psychology 16(1–2), 3–26 (2004)
Winke, P.M.: Eye-tracking technology for reading. The Companion to Language Assessment 2, 1029–1046 (2013)
Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: ZuCo, a simultaneous EEG and eye-tracking resource for natural sentence reading. Scientific Data 5(1), 1–13 (2018)
Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior Research Methods 49, 602–615 (2017)
Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: CELER: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022)
Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior Research Methods, 1–21 (2022)
Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: Activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016)
Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in English as a second language: Evidence from the Multilingual Eye-movements Corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023)
Hollenstein, N., Barrett, M., Björnsdóttir, M.: The Copenhagen Corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182
Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior Research Methods, 1–7 (2023)
Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: Empirical foundations for a minimal reporting guideline. Behavior Research Methods 55(1), 364–416 (2023)
4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. 
[2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. 
[2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. 
71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. 
Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: Zuco, a simultaneous eeg and eye-tracking resource for natural sentence reading. Scientific data 5(1), 1–13 (2018) Cop et al. [2017] Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior research methods 49, 602–615 (2017) Berzak et al. [2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: Celer: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. 
Journal of Neuroscience 36(14), 4050–4055 (2016)
Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in English as a second language: Evidence from the Multilingual Eye-movements Corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023)
Hollenstein, N., Barrett, M., Björnsdóttir, M.: The Copenhagen Corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182
Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior Research Methods, 1–7 (2023)
[2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. 
[2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. 
[2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. 
Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . 
https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. 
[2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. 
[2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. 
In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 
1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. 
[2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. 
[2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. 
Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: TurkerGaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015)
Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam eye tracking for remote studies of web search. In: Proceedings of the 2017 Conference on Human Information Interaction and Retrieval (CHIIR '17), pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170
Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-based personalized summarization of Wikipedia reading session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext (HUMAN '20). Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662
Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature Communications 11(1), 4553 (2020)
Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022). https://doi.org/10.1016/j.bspc.2022.103521
Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera – Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022)
Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023)
Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421
Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264
Manning, C.D., Raghavan, P., Schütze, H.: Introduction to Information Retrieval. Cambridge University Press, Cambridge (2008)
Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016). https://doi.org/10.3758/s13428-015-0642-8
Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000)
Just, M.A., Carpenter, P.A.: A theory of reading: From eye fixations to comprehension. Psychological Review 87(4), 329 (1980)
DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408
Wiegreffe, S., Marasović, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021)
Chiang, C.-H., Lee, H.-Y.: Re-examining human annotations for interpretable NLP (2022)
Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023)
Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022)
Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movement events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155. IEEE (2021)
Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423
Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022)
Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023)
Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in English as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023)
Hollenstein, N., Barrett, M., Björnsdóttir, M.: The Copenhagen Corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182
Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior Research Methods, 1–7 (2023)
Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: Empirical foundations for a minimal reporting guideline. Behavior Research Methods 55(1), 364–416 (2023)
Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019)
Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer (2023)
Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence, pp. 4907–4913 (2021)
Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018). https://doi.org/10.3758/s13428-017-0908-4
González-Garduño, A.V., Søgaard, A.: Using gaze to predict text readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050
Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016
Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence classification with human attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030
Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2
Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012)
Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: How speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010)
Ferhat, O., Vilariño, F.: Low cost eye tracking: The current panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016). https://doi.org/10.1155/2016/8680541
Papoutsaki, A.: Scalable webcam eye tracking by learning from user interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems (CHI EA '15), pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627
Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013). https://doi.org/10.1109/TPAMI.2012.101
Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014). https://doi.org/10.1109/TPAMI.2014.2313123
In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. 
In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012)
Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: How speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010)
Ferhat, O., Vilariño, F.: Low cost eye tracking: The current panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016). https://doi.org/10.1155/2016/8680541
In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. 
Nature Communications 11(1), 4553 (2020)
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021)
Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. 
[2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. 
[2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. 
Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. 
[2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. 
Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. 
[2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. 
[2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. 
In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. 
[2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. 
[2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. 
In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155. IEEE (2021)
  11. McGuire, E., Tomuro, N.: Sentiment analysis with cognitive attention supervision. In: Canadian Conference on AI (2021)
  12. Dong, S., Goldstein, J., Yang, G.H.: GazBy: Gaze-Based BERT Model to Incorporate Human Attention in Neural Information Retrieval. In: Proceedings of the 2022 ACM SIGIR International Conference on Theory of Information Retrieval, pp. 182–192 (2022). https://doi.org/10.1145/3539813.3545129
  13. Eberle, O., Brandl, S., Pilot, J., Søgaard, A.: Do transformer models show similar attention patterns to task-specific human gaze? In: Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pp. 4295–4309. Association for Computational Linguistics, Dublin, Ireland (2022). https://doi.org/10.18653/v1/2022.acl-long.296
  14. Hollenstein, N., Troendle, M., Zhang, C., Langer, N.: ZuCo 2.0: A dataset of physiological recordings during natural reading and annotation. In: Proceedings of the Twelfth Language Resources and Evaluation Conference, pp. 138–146. European Language Resources Association, Marseille, France (2020). https://aclanthology.org/2020.lrec-1.18
  15. Malmaud, J., Levy, R., Berzak, Y.: Bridging information-seeking human gaze and machine reading comprehension. In: Proceedings of the 24th Conference on Computational Natural Language Learning, pp. 142–152. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.conll-1.11
  16. de Leeuw, J.R.: jsPsych: A JavaScript library for creating behavioral experiments in a Web browser. Behavior Research Methods 47(1), 1–12 (2015). https://doi.org/10.3758/s13428-014-0458-y
  17. Papoutsaki, A., Sangkloy, P., Laskey, J., Daskalova, N., Huang, J., Hays, J.: WebGazer: Scalable webcam eye tracking using user interactions. In: Proceedings of the Twenty-Fifth International Joint Conference on Artificial Intelligence. IJCAI'16, pp. 3839–3845. AAAI Press (2016)
  18. Rayner, K., Pollatsek, A., Ashby, J., Clifton Jr., C.: Psychology of Reading (2012)
  19. Liversedge, S.P., Findlay, J.M.: Saccadic eye movements and cognition. Trends in Cognitive Sciences 4(1), 6–14 (2000)
  20. Rayner, K.: Eye movements in reading and information processing: 20 years of research. Psychological Bulletin 124(3), 372 (1998)
  21. Radach, R., Kennedy, A.: Theoretical perspectives on eye movements in reading: Past controversies, current issues, and an agenda for future research. European Journal of Cognitive Psychology 16(1–2), 3–26 (2004)
  22. Winke, P.M.: Eye-tracking technology for reading. The Companion to Language Assessment 2, 1029–1046 (2013)
  23. Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: ZuCo, a simultaneous EEG and eye-tracking resource for natural sentence reading. Scientific Data 5(1), 1–13 (2018)
  24. Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior Research Methods 49, 602–615 (2017)
  25. Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: CELER: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022)
  26. Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior Research Methods, 1–21 (2022)
  27. Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: Activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016)
  28. Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in English as a second language: Evidence from the Multilingual Eye-movements Corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023)
  29. Hollenstein, N., Barrett, M., Björnsdóttir, M.: The Copenhagen Corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182
  30. Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior Research Methods, 1–7 (2023)
  31. Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: Empirical foundations for a minimal reporting guideline. Behavior Research Methods 55(1), 364–416 (2023)
  32. Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019)
  33. Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer (2023)
  34. Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence, pp. 4907–4913 (2021)
  35. Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018). https://doi.org/10.3758/s13428-017-0908-4
  36. González-Garduño, A.V., Søgaard, A.: Using gaze to predict text readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050
  37. Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016
  38. Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence classification with human attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030
  39. Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2
  40. Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012)
  41. Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: How speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010)
  42. Ferhat, O., Vilariño, F.: Low cost eye tracking: The current panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016). https://doi.org/10.1155/2016/8680541
  43. Papoutsaki, A.: Scalable webcam eye tracking by learning from user interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA '15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627
  44. Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013). https://doi.org/10.1109/TPAMI.2012.101
  45. Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014). https://doi.org/10.1109/TPAMI.2014.2313123
  46. Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: TurkerGaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015)
  47. Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam eye tracking for remote studies of web search. In: Proceedings of the 2017 Conference on Human Information Interaction and Retrieval. CHIIR '17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170
  48. Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-based personalized summarization of Wikipedia reading session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN '20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662
  49. Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature Communications 11(1), 4553 (2020)
  50. Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022). https://doi.org/10.1016/j.bspc.2022.103521
  51. Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera – Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022)
  52. Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023)
  53. Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421
  54. Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264
  55. Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval, vol. 39. Cambridge University Press, Cambridge (2008)
  56. Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016). https://doi.org/10.3758/s13428-015-0642-8
  57. Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000)
  58. Just, M.A., Carpenter, P.A.: A theory of reading: From eye fixations to comprehension. Psychological Review 87(4), 329 (1980)
  59. DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408
  60. Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021)
  61. Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022)
  62. Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023)
  63. Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022)
  64. Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155. IEEE (2021)
  65. Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423
  66. Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022)
  67. Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023)
  68. Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Eberle, O., Brandl, S., Pilot, J., Søgaard, A.: Do transformer models show similar attention patterns to task-specific human gaze? In: Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pp. 4295–4309. Association for Computational Linguistics, Dublin, Ireland (2022). https://doi.org/10.18653/v1/2022.acl-long.296 . https://aclanthology.org/2022.acl-long.296 Hollenstein et al. [2020] Hollenstein, N., Troendle, M., Zhang, C., Langer, N.: ZuCo 2.0: A dataset of physiological recordings during natural reading and annotation. In: Proceedings of the Twelfth Language Resources and Evaluation Conference, pp. 138–146. European Language Resources Association, Marseille, France (2020). https://aclanthology.org/2020.lrec-1.18 Malmaud et al. [2020] Malmaud, J., Levy, R., Berzak, Y.: Bridging information-seeking human gaze and machine reading comprehension. In: Proceedings of the 24th Conference on Computational Natural Language Learning, pp. 142–152. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.conll-1.11 . https://aclanthology.org/2020.conll-1.11 de Leeuw [2015] Leeuw, J.R.: jsPsych: A JavaScript library for creating behavioral experiments in a Web browser. Behavior Research Methods 47(1), 1–12 (2015) https://doi.org/10.3758/s13428-014-0458-y Papoutsaki et al. [2016] Papoutsaki, A., Sangkloy, P., Laskey, J., Daskalova, N., Huang, J., Hays, J.: Webgazer: Scalable webcam eye tracking using user interactions. In: Proceedings of the Twenty-Fifth International Joint Conference on Artificial Intelligence. IJCAI’16, pp. 3839–3845. AAAI Press, ??? (2016) Rayner et al. 
[2012] Rayner, K., Pollatsek, A., Ashby, J., Clifton Jr, C.: Psychology of reading (2012) Liversedge and Findlay [2000] Liversedge, S.P., Findlay, J.M.: Saccadic eye movements and cognition. Trends in cognitive sciences 4(1), 6–14 (2000) Rayner [1998] Rayner, K.: Eye movements in reading and information processing: 20 years of research. Psychological bulletin 124(3), 372 (1998) Radach and Kennedy [2004] Radach, R., Kennedy, A.: Theoretical perspectives on eye movements in reading: Past controversies, current issues, and an agenda for future research. European journal of cognitive psychology 16(1-2), 3–26 (2004) Winke [2013] Winke, P.M.: Eye-tracking technology for reading. The companion to language assessment 2, 1029–1046 (2013) Hollenstein et al. [2018] Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: Zuco, a simultaneous eeg and eye-tracking resource for natural sentence reading. Scientific data 5(1), 1–13 (2018) Cop et al. [2017] Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior research methods 49, 602–615 (2017) Berzak et al. [2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: Celer: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. 
[2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 
4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. 
[2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. 
[2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. 
[2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Hollenstein, N., Troendle, M., Zhang, C., Langer, N.: ZuCo 2.0: A dataset of physiological recordings during natural reading and annotation. In: Proceedings of the Twelfth Language Resources and Evaluation Conference, pp. 138–146. European Language Resources Association, Marseille, France (2020). https://aclanthology.org/2020.lrec-1.18 Malmaud et al. [2020] Malmaud, J., Levy, R., Berzak, Y.: Bridging information-seeking human gaze and machine reading comprehension. In: Proceedings of the 24th Conference on Computational Natural Language Learning, pp. 142–152. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.conll-1.11 . https://aclanthology.org/2020.conll-1.11 de Leeuw [2015] Leeuw, J.R.: jsPsych: A JavaScript library for creating behavioral experiments in a Web browser. Behavior Research Methods 47(1), 1–12 (2015) https://doi.org/10.3758/s13428-014-0458-y Papoutsaki et al. [2016] Papoutsaki, A., Sangkloy, P., Laskey, J., Daskalova, N., Huang, J., Hays, J.: Webgazer: Scalable webcam eye tracking using user interactions. In: Proceedings of the Twenty-Fifth International Joint Conference on Artificial Intelligence. IJCAI’16, pp. 3839–3845. AAAI Press, ??? (2016) Rayner et al. [2012] Rayner, K., Pollatsek, A., Ashby, J., Clifton Jr, C.: Psychology of reading (2012) Liversedge and Findlay [2000] Liversedge, S.P., Findlay, J.M.: Saccadic eye movements and cognition. Trends in cognitive sciences 4(1), 6–14 (2000) Rayner [1998] Rayner, K.: Eye movements in reading and information processing: 20 years of research. Psychological bulletin 124(3), 372 (1998) Radach and Kennedy [2004] Radach, R., Kennedy, A.: Theoretical perspectives on eye movements in reading: Past controversies, current issues, and an agenda for future research. 
European journal of cognitive psychology 16(1-2), 3–26 (2004) Winke [2013] Winke, P.M.: Eye-tracking technology for reading. The companion to language assessment 2, 1029–1046 (2013) Hollenstein et al. [2018] Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: Zuco, a simultaneous eeg and eye-tracking resource for natural sentence reading. Scientific data 5(1), 1–13 (2018) Cop et al. [2017] Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior research methods 49, 602–615 (2017) Berzak et al. [2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: Celer: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). 
https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. 
[2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 https://aclanthology.org/K16-1016
Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence classification with human attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 https://aclanthology.org/K18-1030
Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2
Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012)
Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: How speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010)
Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low cost eye tracking: The current panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016). https://doi.org/10.1155/2016/8680541
Papoutsaki [2015] Papoutsaki, A.: Scalable webcam eye tracking by learning from user interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA '15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627
Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013). https://doi.org/10.1109/TPAMI.2012.101
Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014). https://doi.org/10.1109/TPAMI.2014.2313123
Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: TurkerGaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015)
Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam eye tracking for remote studies of web search. In: Proceedings of the 2017 Conference on Human Information Interaction and Retrieval. CHIIR '17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170
Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-based personalized summarization of Wikipedia reading session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN '20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662
Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature Communications 11(1), 4553 (2020)
Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022). https://doi.org/10.1016/j.bspc.2022.103521
Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera — focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022)
Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023)
Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 https://aclanthology.org/2020.acl-main.421
Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 https://aclanthology.org/D16-1264
Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval, vol. 39. Cambridge University Press, Cambridge (2008)
Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016). https://doi.org/10.3758/s13428-015-0642-8
Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000)
Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: From eye fixations to comprehension. Psychological Review 87(4), 329 (1980)
DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 https://aclanthology.org/2020.acl-main.408
Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021)
Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-examining human annotations for interpretable NLP (2022)
Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023)
Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022)
Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155. IEEE (2021)
Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 https://aclanthology.org/N19-1423
Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022)
Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023)
Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
Malmaud et al. [2020] Malmaud, J., Levy, R., Berzak, Y.: Bridging information-seeking human gaze and machine reading comprehension. In: Proceedings of the 24th Conference on Computational Natural Language Learning, pp. 142–152. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.conll-1.11 https://aclanthology.org/2020.conll-1.11
de Leeuw [2015] de Leeuw, J.R.: jsPsych: A JavaScript library for creating behavioral experiments in a Web browser. Behavior Research Methods 47(1), 1–12 (2015). https://doi.org/10.3758/s13428-014-0458-y
Papoutsaki et al. [2016] Papoutsaki, A., Sangkloy, P., Laskey, J., Daskalova, N., Huang, J., Hays, J.: WebGazer: Scalable webcam eye tracking using user interactions. In: Proceedings of the Twenty-Fifth International Joint Conference on Artificial Intelligence. IJCAI '16, pp. 3839–3845. AAAI Press (2016)
Rayner et al. [2012] Rayner, K., Pollatsek, A., Ashby, J., Clifton Jr, C.: Psychology of Reading (2012)
Liversedge and Findlay [2000] Liversedge, S.P., Findlay, J.M.: Saccadic eye movements and cognition. Trends in Cognitive Sciences 4(1), 6–14 (2000)
Rayner [1998] Rayner, K.: Eye movements in reading and information processing: 20 years of research. Psychological Bulletin 124(3), 372 (1998)
Radach and Kennedy [2004] Radach, R., Kennedy, A.: Theoretical perspectives on eye movements in reading: Past controversies, current issues, and an agenda for future research. European Journal of Cognitive Psychology 16(1-2), 3–26 (2004)
Winke [2013] Winke, P.M.: Eye-tracking technology for reading. The Companion to Language Assessment 2, 1029–1046 (2013)
Hollenstein et al. [2018] Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: ZuCo, a simultaneous EEG and eye-tracking resource for natural sentence reading. Scientific Data 5(1), 1–13 (2018)
Cop et al. [2017] Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior Research Methods 49, 602–615 (2017)
Berzak et al. [2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: CELER: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022)
Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior Research Methods, 1–21 (2022)
Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: Activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016)
Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in English as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023)
Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The Copenhagen Corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182
Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior Research Methods, 1–7 (2023)
Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: Empirical foundations for a minimal reporting guideline. Behavior Research Methods 55(1), 364–416 (2023)
Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019)
Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer (2023)
Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing.
In: Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence, pp. 4907–4913 (2021)
Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018). https://doi.org/10.3758/s13428-017-0908-4
González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using gaze to predict text readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 https://aclanthology.org/W17-5050
Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . 
https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. 
[2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. 
[2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. 
In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 
1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Rayner, K., Pollatsek, A., Ashby, J., Clifton Jr, C.: Psychology of reading (2012) Liversedge and Findlay [2000] Liversedge, S.P., Findlay, J.M.: Saccadic eye movements and cognition. Trends in cognitive sciences 4(1), 6–14 (2000) Rayner [1998] Rayner, K.: Eye movements in reading and information processing: 20 years of research. Psychological bulletin 124(3), 372 (1998) Radach and Kennedy [2004] Radach, R., Kennedy, A.: Theoretical perspectives on eye movements in reading: Past controversies, current issues, and an agenda for future research. European journal of cognitive psychology 16(1-2), 3–26 (2004) Winke [2013] Winke, P.M.: Eye-tracking technology for reading. The companion to language assessment 2, 1029–1046 (2013) Hollenstein et al. [2018] Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: Zuco, a simultaneous eeg and eye-tracking resource for natural sentence reading. Scientific data 5(1), 1–13 (2018) Cop et al. [2017] Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior research methods 49, 602–615 (2017) Berzak et al. [2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: Celer: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. 
[2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. 
arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. 
In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. 
[2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. 
[2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . 
https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. 
[2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Liversedge, S.P., Findlay, J.M.: Saccadic eye movements and cognition. Trends in cognitive sciences 4(1), 6–14 (2000) Rayner [1998] Rayner, K.: Eye movements in reading and information processing: 20 years of research. Psychological bulletin 124(3), 372 (1998) Radach and Kennedy [2004] Radach, R., Kennedy, A.: Theoretical perspectives on eye movements in reading: Past controversies, current issues, and an agenda for future research. European journal of cognitive psychology 16(1-2), 3–26 (2004) Winke [2013] Winke, P.M.: Eye-tracking technology for reading. The companion to language assessment 2, 1029–1046 (2013) Hollenstein et al. [2018] Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: Zuco, a simultaneous eeg and eye-tracking resource for natural sentence reading. Scientific data 5(1), 1–13 (2018) Cop et al. [2017] Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior research methods 49, 602–615 (2017) Berzak et al. [2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: Celer: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). 
Behavior research methods, 1–21 (2022). Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. 
[2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. 
[2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012)
Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010)
Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low cost eye tracking: The current panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016). https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation
Papoutsaki [2015] Papoutsaki, A.: Scalable webcam eye tracking by learning from user interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627
Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013). https://doi.org/10.1109/TPAMI.2012.101
Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014). https://doi.org/10.1109/TPAMI.2014.2313123
Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: TurkerGaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015)
Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam eye tracking for remote studies of web search. In: Proceedings of the 2017 Conference on Human Information Interaction and Retrieval. CHIIR ’17, pp. 17–26.
Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170
Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-based personalized summarization of Wikipedia reading session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN ’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662
Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature Communications 11(1), 4553 (2020)
Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022). https://doi.org/10.1016/j.bspc.2022.103521
Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera: Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022)
Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023)
Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 .
https://aclanthology.org/2020.acl-main.421
Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264
Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval, vol. 39. Cambridge University Press, Cambridge (2008)
Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016). https://doi.org/10.3758/s13428-015-0642-8
Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000)
Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological Review 87(4), 329 (1980)
DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408
Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing.
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021)
Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-examining human annotations for interpretable NLP (2022)
Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023)
Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022)
Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155. IEEE (2021)
Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423
Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022)
Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023)
Schneegans et al.
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
Rayner [1998] Rayner, K.: Eye movements in reading and information processing: 20 years of research. Psychological Bulletin 124(3), 372 (1998)
Radach and Kennedy [2004] Radach, R., Kennedy, A.: Theoretical perspectives on eye movements in reading: Past controversies, current issues, and an agenda for future research. European Journal of Cognitive Psychology 16(1-2), 3–26 (2004)
Winke [2013] Winke, P.M.: Eye-tracking technology for reading. The Companion to Language Assessment 2, 1029–1046 (2013)
Hollenstein et al. [2018] Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: ZuCo, a simultaneous EEG and eye-tracking resource for natural sentence reading. Scientific Data 5(1), 1–13 (2018)
Cop et al. [2017] Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior Research Methods 49, 602–615 (2017)
Berzak et al. [2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: CELER: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022)
Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior Research Methods, 1–21 (2022). Publisher: Springer
[2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. 
[2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. 
[2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. 
[2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 
71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. 
Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: Zuco, a simultaneous eeg and eye-tracking resource for natural sentence reading. Scientific data 5(1), 1–13 (2018) Cop et al. [2017] Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior research methods 49, 602–615 (2017) Berzak et al. [2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: Celer: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. 
Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 
4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. 
[2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. 
[2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. 
[2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior research methods 49, 602–615 (2017) Berzak et al. [2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: Celer: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. 
[2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. 
In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: TurkerGaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction and Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature Communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests.
Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval, vol. 39. Cambridge University Press, Cambridge (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online.
Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological Review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasović [2021] Wiegreffe, S., Marasović, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al.
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Berzak et al. [2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: CELER: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior Research Methods, 1–21 (2022). Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al.
[2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in English as a second language: Evidence from the Multilingual Eye-movements Corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The Copenhagen Corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182
[2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. 
[2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. 
In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 https://aclanthology.org/2020.acl-main.408
CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. 
[2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. 
Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. 
In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. 
In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . 
https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. 
[2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. 
In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. 
IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. 
In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. 
[2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. 
[2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. 
Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 Sugano et al.
[2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Human Information Interaction and Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature Communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests.
Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera: Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press, Cambridge (2008) Gureckis et al.
Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological Review 87(4), 329 (1980)
Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. 
[2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. 
Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . 
https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. 
[2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. 
Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. 
Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. 
Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. 
Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). 
https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 
Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022)
Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155. IEEE (2021)
Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423
Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022)
Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023)
Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012)
Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: How speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010)
Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low cost eye tracking: The current panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016). https://doi.org/10.1155/2016/8680541
Papoutsaki [2015] Papoutsaki, A.: Scalable webcam eye tracking by learning from user interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems (CHI EA '15), pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627
Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013). https://doi.org/10.1109/TPAMI.2012.101
Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014). https://doi.org/10.1109/TPAMI.2014.2313123
Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: TurkerGaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015)
Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam eye tracking for remote studies of web search. In: Proceedings of the 2017 Conference on Human Information Interaction and Retrieval (CHIIR '17), pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170
Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-based personalized summarization of Wikipedia reading session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext (HUMAN '20). Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662
Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature Communications 11(1), 4553 (2020)
Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022). https://doi.org/10.1016/j.bspc.2022.103521
Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera: Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022)
Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023)
Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421
Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264
Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval. Cambridge University Press, Cambridge (2008)
Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016). https://doi.org/10.3758/s13428-015-0642-8
Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000)
Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: From eye fixations to comprehension. Psychological Review 87(4), 329 (1980)
DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408
Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021)
Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-examining human annotations for interpretable NLP (2022)
Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023)
In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. 
[2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . 
https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. 
[2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. 
Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. 
[2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. 
[2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. 
Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. 
[2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. 
Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. 
[2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. 
[2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. 
In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. 
[2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. 
[2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. 
In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 
45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 
45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. 
Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). 
IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . 
https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
  12. Dong, S., Goldstein, J., Yang, G.H.: GazBy: Gaze-based BERT model to incorporate human attention in neural information retrieval. In: Proceedings of the 2022 ACM SIGIR International Conference on Theory of Information Retrieval, pp. 182–192 (2022). https://doi.org/10.1145/3539813.3545129
  13. Eberle, O., Brandl, S., Pilot, J., Søgaard, A.: Do transformer models show similar attention patterns to task-specific human gaze? In: Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pp. 4295–4309. Association for Computational Linguistics, Dublin, Ireland (2022). https://doi.org/10.18653/v1/2022.acl-long.296
  14. Hollenstein, N., Troendle, M., Zhang, C., Langer, N.: ZuCo 2.0: A dataset of physiological recordings during natural reading and annotation. In: Proceedings of the Twelfth Language Resources and Evaluation Conference, pp. 138–146. European Language Resources Association, Marseille, France (2020). https://aclanthology.org/2020.lrec-1.18
  15. Malmaud, J., Levy, R., Berzak, Y.: Bridging information-seeking human gaze and machine reading comprehension. In: Proceedings of the 24th Conference on Computational Natural Language Learning, pp. 142–152. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.conll-1.11
  16. de Leeuw, J.R.: jsPsych: A JavaScript library for creating behavioral experiments in a web browser. Behavior Research Methods 47(1), 1–12 (2015). https://doi.org/10.3758/s13428-014-0458-y
  17. Papoutsaki, A., Sangkloy, P., Laskey, J., Daskalova, N., Huang, J., Hays, J.: WebGazer: Scalable webcam eye tracking using user interactions. In: Proceedings of the Twenty-Fifth International Joint Conference on Artificial Intelligence (IJCAI'16), pp. 3839–3845. AAAI Press (2016)
  18. Rayner, K., Pollatsek, A., Ashby, J., Clifton Jr., C.: Psychology of Reading (2012)
  19. Liversedge, S.P., Findlay, J.M.: Saccadic eye movements and cognition. Trends in Cognitive Sciences 4(1), 6–14 (2000)
  20. Rayner, K.: Eye movements in reading and information processing: 20 years of research. Psychological Bulletin 124(3), 372 (1998)
  21. Radach, R., Kennedy, A.: Theoretical perspectives on eye movements in reading: Past controversies, current issues, and an agenda for future research. European Journal of Cognitive Psychology 16(1–2), 3–26 (2004)
  22. Winke, P.M.: Eye-tracking technology for reading. The Companion to Language Assessment 2, 1029–1046 (2013)
  23. Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: ZuCo, a simultaneous EEG and eye-tracking resource for natural sentence reading. Scientific Data 5(1), 1–13 (2018)
  24. Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior Research Methods 49, 602–615 (2017)
  25. Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: CELER: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022)
  26. Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior Research Methods, 1–21 (2022). Springer
  27. Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: Activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016)
  28. Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in English as a second language: Evidence from the Multilingual Eye-movements Corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023)
  29. Hollenstein, N., Barrett, M., Björnsdóttir, M.: The Copenhagen Corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182
  30. Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior Research Methods, 1–7 (2023)
  31. Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: Empirical foundations for a minimal reporting guideline. Behavior Research Methods 55(1), 364–416 (2023)
  32. Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019)
  33. Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer (2023)
  34. Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence, pp. 4907–4913 (2021)
  35. Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018). https://doi.org/10.3758/s13428-017-0908-4
  36. González-Garduño, A.V., Søgaard, A.: Using gaze to predict text readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050
  37. Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016
  38. Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence classification with human attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030
  39. Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2
  40. Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012)
  41. Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: How speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010)
  42. Ferhat, O., Vilariño, F.: Low cost eye tracking: The current panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016). https://doi.org/10.1155/2016/8680541
  43. Papoutsaki, A.: Scalable webcam eye tracking by learning from user interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems (CHI EA '15), pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627
  44. Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013). https://doi.org/10.1109/TPAMI.2012.101
  45. Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014). https://doi.org/10.1109/TPAMI.2014.2313123
  46. Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: TurkerGaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015)
  47. Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam eye tracking for remote studies of web search. In: Proceedings of the 2017 Conference on Human Information Interaction and Retrieval (CHIIR '17), pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170
  48. Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-based personalized summarization of Wikipedia reading session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext (HUMAN '20). Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662
  49. Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature Communications 11(1), 4553 (2020)
  50. Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022). https://doi.org/10.1016/j.bspc.2022.103521
  51. Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera: Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022)
  52. Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023)
  53. Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421
  54. Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264
  55. Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval, vol. 39. Cambridge University Press, Cambridge (2008)
  56. Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016). https://doi.org/10.3758/s13428-015-0642-8
  57. Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000)
  58. Just, M.A., Carpenter, P.A.: A theory of reading: From eye fixations to comprehension. Psychological Review 87(4), 329 (1980)
  59. DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408
  60. Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021)
  61. Chiang, C.-H., Lee, H.-Y.: Re-examining human annotations for interpretable NLP (2022)
  62. Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023)
  63. Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022)
  64. Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155. IEEE (2021)
  65. Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423
  66. Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022)
  67. Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023)
  68. Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
[2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 
4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. 
[2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. 
[2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. 
[2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Hollenstein, N., Troendle, M., Zhang, C., Langer, N.: ZuCo 2.0: A dataset of physiological recordings during natural reading and annotation. In: Proceedings of the Twelfth Language Resources and Evaluation Conference, pp. 138–146. European Language Resources Association, Marseille, France (2020). https://aclanthology.org/2020.lrec-1.18 Malmaud et al. [2020] Malmaud, J., Levy, R., Berzak, Y.: Bridging information-seeking human gaze and machine reading comprehension. In: Proceedings of the 24th Conference on Computational Natural Language Learning, pp. 142–152. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.conll-1.11 . https://aclanthology.org/2020.conll-1.11 de Leeuw [2015] Leeuw, J.R.: jsPsych: A JavaScript library for creating behavioral experiments in a Web browser. Behavior Research Methods 47(1), 1–12 (2015) https://doi.org/10.3758/s13428-014-0458-y Papoutsaki et al. [2016] Papoutsaki, A., Sangkloy, P., Laskey, J., Daskalova, N., Huang, J., Hays, J.: Webgazer: Scalable webcam eye tracking using user interactions. In: Proceedings of the Twenty-Fifth International Joint Conference on Artificial Intelligence. IJCAI’16, pp. 3839–3845. AAAI Press, ??? (2016) Rayner et al. [2012] Rayner, K., Pollatsek, A., Ashby, J., Clifton Jr, C.: Psychology of reading (2012) Liversedge and Findlay [2000] Liversedge, S.P., Findlay, J.M.: Saccadic eye movements and cognition. Trends in cognitive sciences 4(1), 6–14 (2000) Rayner [1998] Rayner, K.: Eye movements in reading and information processing: 20 years of research. Psychological bulletin 124(3), 372 (1998) Radach and Kennedy [2004] Radach, R., Kennedy, A.: Theoretical perspectives on eye movements in reading: Past controversies, current issues, and an agenda for future research. 
European journal of cognitive psychology 16(1-2), 3–26 (2004) Winke [2013] Winke, P.M.: Eye-tracking technology for reading. The companion to language assessment 2, 1029–1046 (2013) Hollenstein et al. [2018] Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: Zuco, a simultaneous eeg and eye-tracking resource for natural sentence reading. Scientific data 5(1), 1–13 (2018) Cop et al. [2017] Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior research methods 49, 602–615 (2017) Berzak et al. [2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: Celer: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). 
https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. 
[2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. 
Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. 
[2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. 
Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Malmaud, J., Levy, R., Berzak, Y.: Bridging information-seeking human gaze and machine reading comprehension. In: Proceedings of the 24th Conference on Computational Natural Language Learning, pp. 142–152. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.conll-1.11 . https://aclanthology.org/2020.conll-1.11 de Leeuw [2015] Leeuw, J.R.: jsPsych: A JavaScript library for creating behavioral experiments in a Web browser. Behavior Research Methods 47(1), 1–12 (2015) https://doi.org/10.3758/s13428-014-0458-y Papoutsaki et al. [2016] Papoutsaki, A., Sangkloy, P., Laskey, J., Daskalova, N., Huang, J., Hays, J.: Webgazer: Scalable webcam eye tracking using user interactions. 
In: Proceedings of the Twenty-Fifth International Joint Conference on Artificial Intelligence. IJCAI’16, pp. 3839–3845. AAAI Press, ??? (2016) Rayner et al. [2012] Rayner, K., Pollatsek, A., Ashby, J., Clifton Jr, C.: Psychology of reading (2012) Liversedge and Findlay [2000] Liversedge, S.P., Findlay, J.M.: Saccadic eye movements and cognition. Trends in cognitive sciences 4(1), 6–14 (2000) Rayner [1998] Rayner, K.: Eye movements in reading and information processing: 20 years of research. Psychological bulletin 124(3), 372 (1998) Radach and Kennedy [2004] Radach, R., Kennedy, A.: Theoretical perspectives on eye movements in reading: Past controversies, current issues, and an agenda for future research. European journal of cognitive psychology 16(1-2), 3–26 (2004) Winke [2013] Winke, P.M.: Eye-tracking technology for reading. The companion to language assessment 2, 1029–1046 (2013) Hollenstein et al. [2018] Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: Zuco, a simultaneous eeg and eye-tracking resource for natural sentence reading. Scientific data 5(1), 1–13 (2018) Cop et al. [2017] Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior research methods 49, 602–615 (2017) Berzak et al. [2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: Celer: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). Publisher: Springer Desai et al. 
[2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in English as a second language: Evidence from the Multilingual Eye-movements Corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The Copenhagen Corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing.
In: Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it.
In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: TurkerGaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170
Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al.
[2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE
In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 
1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Rayner, K., Pollatsek, A., Ashby, J., Clifton Jr, C.: Psychology of reading (2012) Liversedge and Findlay [2000] Liversedge, S.P., Findlay, J.M.: Saccadic eye movements and cognition. Trends in cognitive sciences 4(1), 6–14 (2000) Rayner [1998] Rayner, K.: Eye movements in reading and information processing: 20 years of research. Psychological bulletin 124(3), 372 (1998) Radach and Kennedy [2004] Radach, R., Kennedy, A.: Theoretical perspectives on eye movements in reading: Past controversies, current issues, and an agenda for future research. European journal of cognitive psychology 16(1-2), 3–26 (2004) Winke [2013] Winke, P.M.: Eye-tracking technology for reading. The companion to language assessment 2, 1029–1046 (2013) Hollenstein et al. [2018] Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: Zuco, a simultaneous eeg and eye-tracking resource for natural sentence reading. Scientific data 5(1), 1–13 (2018) Cop et al. [2017] Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior research methods 49, 602–615 (2017) Berzak et al. [2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: Celer: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. 
[2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior Research Methods, 1–21 (2022). Publisher: Springer
Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016)
Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in English as a second language: Evidence from the Multilingual Eye-movements Corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023)
Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The Copenhagen Corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182
Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior Research Methods, 1–7 (2023)
Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior Research Methods 55(1), 364–416 (2023)
4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. 
[2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. 
[2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. 
71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. 
Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: Zuco, a simultaneous eeg and eye-tracking resource for natural sentence reading. Scientific data 5(1), 1–13 (2018) Cop et al. [2017] Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior research methods 49, 602–615 (2017) Berzak et al. [2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: Celer: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. 
Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 
4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. 
[2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. 
[2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. 
[2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior research methods 49, 602–615 (2017) Berzak et al. [2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: Celer: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. 
[2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. 
In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . 
https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. 
Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. 
Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8
Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000)
Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: From eye fixations to comprehension. Psychological Review 87(4), 329 (1980)
DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 https://aclanthology.org/2020.acl-main.408
Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-Fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021)
Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-examining human annotations for interpretable NLP (2022)
Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023)
Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022)
Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155. IEEE (2021)
Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 https://aclanthology.org/N19-1423
Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022)
Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023)
Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
Berzak et al. [2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: CELER: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022)
Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior Research Methods, 1–21 (2022)
Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: Activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016)
Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in English as a second language: Evidence from the Multilingual Eye-movements Corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023)
Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The Copenhagen Corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182
Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior Research Methods, 1–7 (2023)
Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: Empirical foundations for a minimal reporting guideline. Behavior Research Methods 55(1), 364–416 (2023)
Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019)
Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer (2023)
Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence, pp. 4907–4913 (2021)
Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4
González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using gaze to predict text readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 https://aclanthology.org/W17-5050
Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 https://aclanthology.org/K16-1016
Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence classification with human attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 https://aclanthology.org/K18-1030
Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2
Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012)
Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: How speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010)
Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low cost eye tracking: The current panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541
Papoutsaki [2015] Papoutsaki, A.: Scalable webcam eye tracking by learning from user interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems, CHI EA '15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627
Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101
Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123
Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: TurkerGaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015)
Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam eye tracking for remote studies of web search. In: Proceedings of the 2017 Conference on Human Information Interaction and Retrieval, CHIIR '17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170
Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-based personalized summarization of Wikipedia reading session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext, HUMAN '20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662
Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature Communications 11(1), 4553 (2020)
Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521
Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera: Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022)
Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023)
Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 https://aclanthology.org/2020.acl-main.421
Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 https://aclanthology.org/D16-1264
Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval, vol. 39. Cambridge University Press, Cambridge (2008)
Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8
arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. 
In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. 
In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. 
In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . 
https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. 
[2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. 
In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. 
IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. 
In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. 
[2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. 
[2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. 
Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. 
[2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. 
Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. 
Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423
Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022)
Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023)
Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence, pp. 4907–4913 (2021)
Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4
González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443.
Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050
Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016
Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030
Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2
Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012)
Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010)
Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541
Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions.
In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627
Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101
Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123
Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: TurkerGaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015)
Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Human Information Interaction and Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170
Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662
Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking.
Nature Communications 11(1), 4553 (2020)
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021)
Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022)
Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023)
Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022)
Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155. IEEE (2021)
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. 
[2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. 
In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 
1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. 
[2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . 
https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. 
[2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. 
[2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. 
[2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. 
In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE
Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423, https://aclanthology.org/N19-1423
Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022)
Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023)
Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera: Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022)
Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023)
Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421, https://aclanthology.org/2020.acl-main.421
Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264, https://aclanthology.org/D16-1264
Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval, vol. 39. Cambridge University Press, Cambridge (2008)
Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016). https://doi.org/10.3758/s13428-015-0642-8
Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000)
Just, M.A., Carpenter, P.A.: A theory of reading: From eye fixations to comprehension. Psychological Review 87(4), 329 (1980)
DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408, https://aclanthology.org/2020.acl-main.408
Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021)
Chiang, C.-H., Lee, H.-Y.: Re-examining human annotations for interpretable NLP (2022)
Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023)
Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022)
Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE
IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . 
https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
  13. Eberle, O., Brandl, S., Pilot, J., Søgaard, A.: Do transformer models show similar attention patterns to task-specific human gaze? In: Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pp. 4295–4309. Association for Computational Linguistics, Dublin, Ireland (2022). https://doi.org/10.18653/v1/2022.acl-long.296
  14. Hollenstein, N., Troendle, M., Zhang, C., Langer, N.: ZuCo 2.0: A dataset of physiological recordings during natural reading and annotation. In: Proceedings of the Twelfth Language Resources and Evaluation Conference, pp. 138–146. European Language Resources Association, Marseille, France (2020). https://aclanthology.org/2020.lrec-1.18
  15. Malmaud, J., Levy, R., Berzak, Y.: Bridging information-seeking human gaze and machine reading comprehension. In: Proceedings of the 24th Conference on Computational Natural Language Learning, pp. 142–152. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.conll-1.11
  16. de Leeuw, J.R.: jsPsych: A JavaScript library for creating behavioral experiments in a Web browser. Behavior Research Methods 47(1), 1–12 (2015). https://doi.org/10.3758/s13428-014-0458-y
  17. Papoutsaki, A., Sangkloy, P., Laskey, J., Daskalova, N., Huang, J., Hays, J.: WebGazer: Scalable webcam eye tracking using user interactions. In: Proceedings of the Twenty-Fifth International Joint Conference on Artificial Intelligence. IJCAI'16, pp. 3839–3845. AAAI Press (2016)
  18. Rayner, K., Pollatsek, A., Ashby, J., Clifton Jr, C.: Psychology of Reading (2012)
  19. Liversedge, S.P., Findlay, J.M.: Saccadic eye movements and cognition. Trends in Cognitive Sciences 4(1), 6–14 (2000)
  20. Rayner, K.: Eye movements in reading and information processing: 20 years of research. Psychological Bulletin 124(3), 372 (1998)
  21. Radach, R., Kennedy, A.: Theoretical perspectives on eye movements in reading: Past controversies, current issues, and an agenda for future research. European Journal of Cognitive Psychology 16(1-2), 3–26 (2004)
  22. Winke, P.M.: Eye-tracking technology for reading. The Companion to Language Assessment 2, 1029–1046 (2013)
  23. Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: ZuCo, a simultaneous EEG and eye-tracking resource for natural sentence reading. Scientific Data 5(1), 1–13 (2018)
  24. Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior Research Methods 49, 602–615 (2017)
  25. Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: CELER: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022)
  26. Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior Research Methods, 1–21 (2022)
  27. Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: Activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016)
  28. Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in English as a second language: Evidence from the Multilingual Eye-movements Corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023)
  29. Hollenstein, N., Barrett, M., Björnsdóttir, M.: The Copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182
  30. Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior Research Methods, 1–7 (2023)
  31. Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: Empirical foundations for a minimal reporting guideline. Behavior Research Methods 55(1), 364–416 (2023)
  32. Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019)
  33. Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer (2023)
  34. Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence, pp. 4907–4913 (2021)
  35. Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018). https://doi.org/10.3758/s13428-017-0908-4
  36. González-Garduño, A.V., Søgaard, A.: Using gaze to predict text readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050
  37. Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016
  38. Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence classification with human attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030
  39. Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2
  40. Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012)
  41. Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: How speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010)
  42. Ferhat, O., Vilariño, F.: Low cost eye tracking: The current panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016). https://doi.org/10.1155/2016/8680541
  43. Papoutsaki, A.: Scalable webcam eye tracking by learning from user interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA '15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627
  44. Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013). https://doi.org/10.1109/TPAMI.2012.101
  45. Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014). https://doi.org/10.1109/TPAMI.2014.2313123
  46. Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: TurkerGaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015)
  47. Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam eye tracking for remote studies of web search. In: Proceedings of the 2017 Conference on Human Information Interaction and Retrieval. CHIIR '17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170
  48. Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-based personalized summarization of Wikipedia reading session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN'20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662
  49. Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature Communications 11(1), 4553 (2020)
  50. Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022). https://doi.org/10.1016/j.bspc.2022.103521
  51. Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera: Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022)
  52. Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023)
  53. Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421
  54. Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264
  55. Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval, vol. 39. Cambridge University Press, Cambridge (2008)
  56. Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016). https://doi.org/10.3758/s13428-015-0642-8
  57. Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000)
  58. Just, M.A., Carpenter, P.A.: A theory of reading: From eye fixations to comprehension. Psychological Review 87(4), 329 (1980)
  59. DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408
  60. Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021)
  61. Chiang, C.-H., Lee, H.-Y.: Re-examining human annotations for interpretable NLP (2022)
  62. Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023)
  63. Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022)
  64. Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155. IEEE (2021)
  65. Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423
  66. Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022)
  67. Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023)
  68. Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 
1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Malmaud, J., Levy, R., Berzak, Y.: Bridging information-seeking human gaze and machine reading comprehension. In: Proceedings of the 24th Conference on Computational Natural Language Learning, pp. 142–152. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.conll-1.11 . https://aclanthology.org/2020.conll-1.11 de Leeuw [2015] Leeuw, J.R.: jsPsych: A JavaScript library for creating behavioral experiments in a Web browser. Behavior Research Methods 47(1), 1–12 (2015) https://doi.org/10.3758/s13428-014-0458-y Papoutsaki et al. [2016] Papoutsaki, A., Sangkloy, P., Laskey, J., Daskalova, N., Huang, J., Hays, J.: Webgazer: Scalable webcam eye tracking using user interactions. In: Proceedings of the Twenty-Fifth International Joint Conference on Artificial Intelligence. IJCAI’16, pp. 3839–3845. AAAI Press, ??? (2016) Rayner et al. [2012] Rayner, K., Pollatsek, A., Ashby, J., Clifton Jr, C.: Psychology of reading (2012) Liversedge and Findlay [2000] Liversedge, S.P., Findlay, J.M.: Saccadic eye movements and cognition. Trends in cognitive sciences 4(1), 6–14 (2000) Rayner [1998] Rayner, K.: Eye movements in reading and information processing: 20 years of research. Psychological bulletin 124(3), 372 (1998) Radach and Kennedy [2004] Radach, R., Kennedy, A.: Theoretical perspectives on eye movements in reading: Past controversies, current issues, and an agenda for future research. 
European journal of cognitive psychology 16(1-2), 3–26 (2004) Winke [2013] Winke, P.M.: Eye-tracking technology for reading. The companion to language assessment 2, 1029–1046 (2013) Hollenstein et al. [2018] Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: Zuco, a simultaneous eeg and eye-tracking resource for natural sentence reading. Scientific data 5(1), 1–13 (2018) Cop et al. [2017] Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior research methods 49, 602–615 (2017) Berzak et al. [2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: Celer: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). 
https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. 
[2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. 
Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. 
[2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. 
Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Leeuw, J.R.: jsPsych: A JavaScript library for creating behavioral experiments in a Web browser. Behavior Research Methods 47(1), 1–12 (2015) https://doi.org/10.3758/s13428-014-0458-y Papoutsaki et al. [2016] Papoutsaki, A., Sangkloy, P., Laskey, J., Daskalova, N., Huang, J., Hays, J.: Webgazer: Scalable webcam eye tracking using user interactions. In: Proceedings of the Twenty-Fifth International Joint Conference on Artificial Intelligence. IJCAI’16, pp. 3839–3845. AAAI Press, ??? (2016) Rayner et al. [2012] Rayner, K., Pollatsek, A., Ashby, J., Clifton Jr, C.: Psychology of reading (2012) Liversedge and Findlay [2000] Liversedge, S.P., Findlay, J.M.: Saccadic eye movements and cognition. 
Trends in cognitive sciences 4(1), 6–14 (2000) Rayner [1998] Rayner, K.: Eye movements in reading and information processing: 20 years of research. Psychological bulletin 124(3), 372 (1998) Radach and Kennedy [2004] Radach, R., Kennedy, A.: Theoretical perspectives on eye movements in reading: Past controversies, current issues, and an agenda for future research. European journal of cognitive psychology 16(1-2), 3–26 (2004) Winke [2013] Winke, P.M.: Eye-tracking technology for reading. The companion to language assessment 2, 1029–1046 (2013) Hollenstein et al. [2018] Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: Zuco, a simultaneous eeg and eye-tracking resource for natural sentence reading. Scientific data 5(1), 1–13 (2018) Cop et al. [2017] Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior research methods 49, 602–615 (2017) Berzak et al. [2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: Celer: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. 
Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. 
In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . 
Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. 
[2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. 
Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. 
[2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155. IEEE (2021)
Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423
Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022)
Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023)
Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
Papoutsaki et al. [2016] Papoutsaki, A., Sangkloy, P., Laskey, J., Daskalova, N., Huang, J., Hays, J.: WebGazer: Scalable webcam eye tracking using user interactions. In: Proceedings of the Twenty-Fifth International Joint Conference on Artificial Intelligence, IJCAI'16, pp. 3839–3845. AAAI Press (2016)
Rayner et al.
[2012] Rayner, K., Pollatsek, A., Ashby, J., Clifton Jr, C.: Psychology of Reading (2012)
Liversedge and Findlay [2000] Liversedge, S.P., Findlay, J.M.: Saccadic eye movements and cognition. Trends in Cognitive Sciences 4(1), 6–14 (2000)
Rayner [1998] Rayner, K.: Eye movements in reading and information processing: 20 years of research. Psychological Bulletin 124(3), 372 (1998)
Radach and Kennedy [2004] Radach, R., Kennedy, A.: Theoretical perspectives on eye movements in reading: Past controversies, current issues, and an agenda for future research. European Journal of Cognitive Psychology 16(1–2), 3–26 (2004)
Winke [2013] Winke, P.M.: Eye-tracking technology for reading. The Companion to Language Assessment 2, 1029–1046 (2013)
Hollenstein et al. [2018] Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: ZuCo, a simultaneous EEG and eye-tracking resource for natural sentence reading. Scientific Data 5(1), 1–13 (2018)
Cop et al. [2017] Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior Research Methods 49, 602–615 (2017)
Berzak et al. [2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: CELER: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022)
Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior Research Methods, 1–21 (2022)
Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: Activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016)
Kuperman et al.
[2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in English as a second language: Evidence from the Multilingual Eye-movements Corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023)
Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The Copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182
Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior Research Methods, 1–7 (2023)
Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: Empirical foundations for a minimal reporting guideline. Behavior Research Methods 55(1), 364–416 (2023)
Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019)
Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer (2023)
Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence, pp. 4907–4913 (2021)
Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018). https://doi.org/10.3758/s13428-017-0908-4
González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using gaze to predict text readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050
Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016
Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence classification with human attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030
Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2
Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012)
Andersson et al.
[2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: How speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010)
Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low cost eye tracking: The current panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016). https://doi.org/10.1155/2016/8680541
Papoutsaki [2015] Papoutsaki, A.: Scalable webcam eye tracking by learning from user interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems, CHI EA '15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627
Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013). https://doi.org/10.1109/TPAMI.2012.101
Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014). https://doi.org/10.1109/TPAMI.2014.2313123
Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: TurkerGaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015)
Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam eye tracking for remote studies of web search. In: Proceedings of the 2017 Conference on Human Information Interaction and Retrieval, CHIIR '17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170
Dubey et al.
[2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. 
[2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Rayner, K., Pollatsek, A., Ashby, J., Clifton Jr, C.: Psychology of reading (2012) Liversedge and Findlay [2000] Liversedge, S.P., Findlay, J.M.: Saccadic eye movements and cognition. Trends in cognitive sciences 4(1), 6–14 (2000) Rayner [1998] Rayner, K.: Eye movements in reading and information processing: 20 years of research. Psychological bulletin 124(3), 372 (1998) Radach and Kennedy [2004] Radach, R., Kennedy, A.: Theoretical perspectives on eye movements in reading: Past controversies, current issues, and an agenda for future research. European journal of cognitive psychology 16(1-2), 3–26 (2004) Winke [2013] Winke, P.M.: Eye-tracking technology for reading. The companion to language assessment 2, 1029–1046 (2013) Hollenstein et al. [2018] Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: Zuco, a simultaneous eeg and eye-tracking resource for natural sentence reading. Scientific data 5(1), 1–13 (2018) Cop et al. [2017] Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior research methods 49, 602–615 (2017) Berzak et al. [2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: Celer: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). Publisher: Springer Desai et al. 
[2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. 
In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. 
In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . 
https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. 
[2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Liversedge, S.P., Findlay, J.M.: Saccadic eye movements and cognition. Trends in cognitive sciences 4(1), 6–14 (2000) Rayner [1998] Rayner, K.: Eye movements in reading and information processing: 20 years of research. Psychological bulletin 124(3), 372 (1998) Radach and Kennedy [2004] Radach, R., Kennedy, A.: Theoretical perspectives on eye movements in reading: Past controversies, current issues, and an agenda for future research. European journal of cognitive psychology 16(1-2), 3–26 (2004) Winke [2013] Winke, P.M.: Eye-tracking technology for reading. The companion to language assessment 2, 1029–1046 (2013) Hollenstein et al. [2018] Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: Zuco, a simultaneous eeg and eye-tracking resource for natural sentence reading. Scientific data 5(1), 1–13 (2018) Cop et al. [2017] Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior research methods 49, 602–615 (2017) Berzak et al. [2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: Celer: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. 
[2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 
4907–4913 (2021)
Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018). https://doi.org/10.3758/s13428-017-0908-4
González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using gaze to predict text readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050. https://aclanthology.org/W17-5050
Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016. https://aclanthology.org/K16-1016
Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence classification with human attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030. https://aclanthology.org/K18-1030
Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2
Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012)
Andersson et al.
[2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010)
Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low cost eye tracking: The current panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016). https://doi.org/10.1155/2016/8680541. Publisher: Hindawi Publishing Corporation
Papoutsaki [2015] Papoutsaki, A.: Scalable webcam eye tracking by learning from user interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627
Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013). https://doi.org/10.1109/TPAMI.2012.101
Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014). https://doi.org/10.1109/TPAMI.2014.2313123
Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: TurkerGaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015)
Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam eye tracking for remote studies of web search. In: Proceedings of the 2017 Conference on Human Information Interaction and Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170
Dubey et al.
[2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-based personalized summarization of Wikipedia reading session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662
Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature Communications 11(1), 4553 (2020)
Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022). https://doi.org/10.1016/j.bspc.2022.103521
Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera: Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022)
Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023)
Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421. https://aclanthology.org/2020.acl-main.421
Rajpurkar et al.
[2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264. https://aclanthology.org/D16-1264
Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval, vol. 39. Cambridge University Press, Cambridge (2008)
Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016). https://doi.org/10.3758/s13428-015-0642-8
Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000)
Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological Review 87(4), 329 (1980)
DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408. https://aclanthology.org/2020.acl-main.408
Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing.
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021)
Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-examining human annotations for interpretable NLP (2022)
Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023)
Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022)
Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE
Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423. https://aclanthology.org/N19-1423
Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022)
Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023)
Behavior research methods 49, 602–615 (2017) Berzak et al. [2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: Celer: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. 
[2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. 
[2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. 
Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. 
Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: Celer: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. 
[2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. 
[2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. 
https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. 
CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. 
[2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. 
Biomedical Signal Processing and Control 74, 103521 (2022). https://doi.org/10.1016/j.bspc.2022.103521
71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. 
1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. 
Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. 
[2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. 
In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. 
[2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. 
[2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. 
[2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. 
In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 
1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. 
[2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . 
https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. 
Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023)
Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature Communications 11(1), 4553 (2020)
Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022). https://doi.org/10.1016/j.bspc.2022.103521
Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera – Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022)
Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023)
Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421
Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264
Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval, vol. 39. Cambridge University Press, Cambridge (2008)
Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016). https://doi.org/10.3758/s13428-015-0642-8
Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000)
Just, M.A., Carpenter, P.A.: A theory of reading: From eye fixations to comprehension. Psychological Review 87(4), 329 (1980)
DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408
Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021)
Chiang, C.-H., Lee, H.-Y.: Re-examining human annotations for interpretable NLP (2022)
Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023)
Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022)
Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155. IEEE (2021)
Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423
Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022)
[2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. 
In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 
45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 
45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. 
Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). 
IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . 
https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
Hollenstein, N., Troendle, M., Zhang, C., Langer, N.: ZuCo 2.0: A dataset of physiological recordings during natural reading and annotation. In: Proceedings of the Twelfth Language Resources and Evaluation Conference, pp. 138–146. European Language Resources Association, Marseille, France (2020). https://aclanthology.org/2020.lrec-1.18
Malmaud, J., Levy, R., Berzak, Y.: Bridging information-seeking human gaze and machine reading comprehension. In: Proceedings of the 24th Conference on Computational Natural Language Learning, pp. 142–152. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.conll-1.11. https://aclanthology.org/2020.conll-1.11
de Leeuw, J.R.: jsPsych: A JavaScript library for creating behavioral experiments in a Web browser. Behavior Research Methods 47(1), 1–12 (2015). https://doi.org/10.3758/s13428-014-0458-y
Papoutsaki, A., Sangkloy, P., Laskey, J., Daskalova, N., Huang, J., Hays, J.: WebGazer: Scalable webcam eye tracking using user interactions. In: Proceedings of the Twenty-Fifth International Joint Conference on Artificial Intelligence. IJCAI'16, pp. 3839–3845. AAAI Press (2016)
Rayner, K., Pollatsek, A., Ashby, J., Clifton Jr, C.: Psychology of Reading (2012)
Liversedge, S.P., Findlay, J.M.: Saccadic eye movements and cognition. Trends in Cognitive Sciences 4(1), 6–14 (2000)
Rayner, K.: Eye movements in reading and information processing: 20 years of research. Psychological Bulletin 124(3), 372 (1998)
Radach, R., Kennedy, A.: Theoretical perspectives on eye movements in reading: Past controversies, current issues, and an agenda for future research. European Journal of Cognitive Psychology 16(1–2), 3–26 (2004)
Winke, P.M.: Eye-tracking technology for reading. The Companion to Language Assessment 2, 1029–1046 (2013)
Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: ZuCo, a simultaneous EEG and eye-tracking resource for natural sentence reading. Scientific Data 5(1), 1–13 (2018)
Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior Research Methods 49, 602–615 (2017)
Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: CELER: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022)
Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior Research Methods, 1–21 (2022). Publisher: Springer
Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: Activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016)
Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in English as a second language: Evidence from the Multilingual Eye-movements Corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023)
Hollenstein, N., Barrett, M., Björnsdóttir, M.: The Copenhagen Corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182
Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior Research Methods, 1–7 (2023)
Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: Empirical foundations for a minimal reporting guideline. Behavior Research Methods 55(1), 364–416 (2023)
Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019)
Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer (2023)
Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence, pp. 4907–4913 (2021)
Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018). https://doi.org/10.3758/s13428-017-0908-4
González-Garduño, A.V., Søgaard, A.: Using gaze to predict text readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050. https://aclanthology.org/W17-5050
Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016. https://aclanthology.org/K16-1016
Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence classification with human attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030. https://aclanthology.org/K18-1030
Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2
Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012)
Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: How speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010)
Ferhat, O., Vilariño, F.: Low cost eye tracking: The current panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016). https://doi.org/10.1155/2016/8680541. Publisher: Hindawi Publishing Corporation
Papoutsaki, A.: Scalable webcam eye tracking by learning from user interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA '15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627
Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013). https://doi.org/10.1109/TPAMI.2012.101
Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014). https://doi.org/10.1109/TPAMI.2014.2313123
Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: TurkerGaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015)
Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam eye tracking for remote studies of web search. In: Proceedings of the 2017 Conference on Human Information Interaction and Retrieval. CHIIR '17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170
Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-based personalized summarization of Wikipedia reading session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN'20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662
Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature Communications 11(1), 4553 (2020)
Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022). https://doi.org/10.1016/j.bspc.2022.103521
Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022)
Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023)
Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421. https://aclanthology.org/2020.acl-main.421
Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264. https://aclanthology.org/D16-1264
Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval, vol. 39. Cambridge University Press, Cambridge (2008)
Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016). https://doi.org/10.3758/s13428-015-0642-8
Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000)
Just, M.A., Carpenter, P.A.: A theory of reading: From eye fixations to comprehension. Psychological Review 87(4), 329 (1980)
DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408. https://aclanthology.org/2020.acl-main.408
Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021)
Chiang, C.-H., Lee, H.-Y.: Re-examining human annotations for interpretable NLP (2022)
Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023)
Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022)
Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155. IEEE (2021)
Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423. https://aclanthology.org/N19-1423
Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022)
Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023)
Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
[2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. 
In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. 
In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . 
https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. 
[2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Leeuw, J.R.: jsPsych: A JavaScript library for creating behavioral experiments in a Web browser. Behavior Research Methods 47(1), 1–12 (2015) https://doi.org/10.3758/s13428-014-0458-y Papoutsaki et al. [2016] Papoutsaki, A., Sangkloy, P., Laskey, J., Daskalova, N., Huang, J., Hays, J.: Webgazer: Scalable webcam eye tracking using user interactions. In: Proceedings of the Twenty-Fifth International Joint Conference on Artificial Intelligence. IJCAI’16, pp. 3839–3845. AAAI Press, ??? (2016) Rayner et al. [2012] Rayner, K., Pollatsek, A., Ashby, J., Clifton Jr, C.: Psychology of reading (2012) Liversedge and Findlay [2000] Liversedge, S.P., Findlay, J.M.: Saccadic eye movements and cognition. Trends in cognitive sciences 4(1), 6–14 (2000) Rayner [1998] Rayner, K.: Eye movements in reading and information processing: 20 years of research. Psychological bulletin 124(3), 372 (1998) Radach and Kennedy [2004] Radach, R., Kennedy, A.: Theoretical perspectives on eye movements in reading: Past controversies, current issues, and an agenda for future research. European journal of cognitive psychology 16(1-2), 3–26 (2004) Winke [2013] Winke, P.M.: Eye-tracking technology for reading. The companion to language assessment 2, 1029–1046 (2013) Hollenstein et al. [2018] Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: Zuco, a simultaneous eeg and eye-tracking resource for natural sentence reading. Scientific data 5(1), 1–13 (2018) Cop et al. [2017] Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior research methods 49, 602–615 (2017) Berzak et al. 
[2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: Celer: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. 
Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . 
https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. 
[2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. 
[2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. 
In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 
1–7 (2022)
Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023)
Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
Papoutsaki et al. [2016] Papoutsaki, A., Sangkloy, P., Laskey, J., Daskalova, N., Huang, J., Hays, J.: WebGazer: Scalable webcam eye tracking using user interactions. In: Proceedings of the Twenty-Fifth International Joint Conference on Artificial Intelligence. IJCAI '16, pp. 3839–3845. AAAI Press (2016)
Rayner et al. [2012] Rayner, K., Pollatsek, A., Ashby, J., Clifton Jr., C.: Psychology of Reading (2012)
Liversedge and Findlay [2000] Liversedge, S.P., Findlay, J.M.: Saccadic eye movements and cognition. Trends in Cognitive Sciences 4(1), 6–14 (2000)
Rayner [1998] Rayner, K.: Eye movements in reading and information processing: 20 years of research. Psychological Bulletin 124(3), 372 (1998)
Radach and Kennedy [2004] Radach, R., Kennedy, A.: Theoretical perspectives on eye movements in reading: Past controversies, current issues, and an agenda for future research. European Journal of Cognitive Psychology 16(1–2), 3–26 (2004)
Winke [2013] Winke, P.M.: Eye-tracking technology for reading. The Companion to Language Assessment 2, 1029–1046 (2013)
Hollenstein et al. [2018] Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: ZuCo, a simultaneous EEG and eye-tracking resource for natural sentence reading. Scientific Data 5(1), 1–13 (2018)
Cop et al. [2017] Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior Research Methods 49, 602–615 (2017)
Berzak et al. [2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: CELER: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022)
Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior Research Methods, 1–21 (2022)
Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: Activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016)
Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in English as a second language: Evidence from the Multilingual Eye-movements Corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023)
Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The Copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182
Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior Research Methods, 1–7 (2023)
Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: Empirical foundations for a minimal reporting guideline. Behavior Research Methods 55(1), 364–416 (2023)
Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019)
Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer (2023)
Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence, pp. 4907–4913 (2021)
Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018). https://doi.org/10.3758/s13428-017-0908-4
González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using gaze to predict text readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050
Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016
Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence classification with human attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030
Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2
Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012)
Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: How speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010)
Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low cost eye tracking: The current panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016). https://doi.org/10.1155/2016/8680541
Papoutsaki [2015] Papoutsaki, A.: Scalable webcam eye tracking by learning from user interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA '15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627
Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013). https://doi.org/10.1109/TPAMI.2012.101
Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014). https://doi.org/10.1109/TPAMI.2014.2313123
Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: TurkerGaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015)
Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam eye tracking for remote studies of web search. In: Proceedings of the 2017 Conference on Human Information Interaction and Retrieval. CHIIR '17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170
Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-based personalized summarization of Wikipedia reading session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN '20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662
Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature Communications 11(1), 4553 (2020)
Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022). https://doi.org/10.1016/j.bspc.2022.103521
Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera: Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022)
Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023)
Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421
Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264
Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval, vol. 39. Cambridge University Press, Cambridge (2008)
Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016). https://doi.org/10.3758/s13428-015-0642-8
Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000)
Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: From eye fixations to comprehension. Psychological Review 87(4), 329 (1980)
DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408
Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021)
Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-examining human annotations for interpretable NLP (2022)
Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023)
Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022)
Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155. IEEE (2021)
Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423
Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp.
1–7 (2022)
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Rayner [1998] Rayner, K.: Eye movements in reading and information processing: 20 years of research. Psychological Bulletin 124(3), 372 (1998) Radach and Kennedy [2004] Radach, R., Kennedy, A.: Theoretical perspectives on eye movements in reading: Past controversies, current issues, and an agenda for future research. European Journal of Cognitive Psychology 16(1-2), 3–26 (2004) Winke [2013] Winke, P.M.: Eye-tracking technology for reading. The Companion to Language Assessment 2, 1029–1046 (2013) Hollenstein et al. [2018] Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: ZuCo, a simultaneous EEG and eye-tracking resource for natural sentence reading. Scientific Data 5(1), 1–13 (2018) Cop et al. [2017] Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior Research Methods 49, 602–615 (2017) Berzak et al. [2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: CELER: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior Research Methods, 1–21 (2022). Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. 
[2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in English as a second language: Evidence from the Multilingual Eye-Movements Corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The Copenhagen Corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior Research Methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior Research Methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence, pp. 
4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2
[2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. 
[2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 
71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. 
[2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 
4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. 
[2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010)
Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541
Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627
In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 
1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. 
Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. 
In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. 
In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . 
https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. 
[2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. 
In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. 
IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. 
In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. 
[2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. 
[2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. 
Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016. https://aclanthology.org/K16-1016
Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence classification with human attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030. https://aclanthology.org/K18-1030
Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2
Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012)
Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: How speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010)
Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low cost eye tracking: The current panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016). https://doi.org/10.1155/2016/8680541
Papoutsaki [2015] Papoutsaki, A.: Scalable webcam eye tracking by learning from user interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems, CHI EA '15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627
Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013). https://doi.org/10.1109/TPAMI.2012.101
Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014). https://doi.org/10.1109/TPAMI.2014.2313123
Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: TurkerGaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015)
Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam eye tracking for remote studies of web search. In: Proceedings of the 2017 Conference on Human Information Interaction and Retrieval, CHIIR '17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170
Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-based personalized summarization of Wikipedia reading session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext, HUMAN '20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662
Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature Communications 11(1), 4553 (2020)
Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022). https://doi.org/10.1016/j.bspc.2022.103521
Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web camera: Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022)
Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023)
Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421. https://aclanthology.org/2020.acl-main.421
Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264. https://aclanthology.org/D16-1264
Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval. Cambridge University Press, Cambridge (2008)
Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016). https://doi.org/10.3758/s13428-015-0642-8
Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000)
Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: From eye fixations to comprehension. Psychological Review 87(4), 329 (1980)
DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408. https://aclanthology.org/2020.acl-main.408
Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021)
Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-examining human annotations for interpretable NLP (2022)
Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023)
Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022)
Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155. IEEE (2021)
Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423. https://aclanthology.org/N19-1423
Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022)
Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023)
Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence, pp. 4907–4913 (2021)
Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018). https://doi.org/10.3758/s13428-017-0908-4
González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using gaze to predict text readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443.
Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050. https://aclanthology.org/W17-5050
Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . 
Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . 
https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. 
[2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. 
[2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . 
https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. 
[2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. 
[2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. 
[2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. 
[2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. 
In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408
Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021)
Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022)
Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023)
Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022)
Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155. IEEE (2021)
Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423
Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022)
Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023)
Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014). https://doi.org/10.1109/TPAMI.2014.2313123
Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: TurkerGaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015)
Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam eye tracking for remote studies of web search. In: Proceedings of the 2017 Conference on Human Information Interaction and Retrieval (CHIIR '17), pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170
Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-based personalized summarization of Wikipedia reading session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext (HUMAN '20). Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662
Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature Communications 11(1), 4553 (2020)
Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022). https://doi.org/10.1016/j.bspc.2022.103521
Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera: Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022)
Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023)
Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421
Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264
Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval. Cambridge University Press, Cambridge (2008)
Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016). https://doi.org/10.3758/s13428-015-0642-8
Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000)
Just, M.A., Carpenter, P.A.: A theory of reading: From eye fixations to comprehension. Psychological Review 87(4), 329 (1980)
https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. 
[2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. 
Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. 
[2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. 
[2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. 
Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. 
[2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. 
Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. 
[2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. 
[2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. 
In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. 
DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408. https://aclanthology.org/2020.acl-main.408
Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021)
Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022)
Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023)
Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022)
Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155. IEEE (2021)
Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423. https://aclanthology.org/N19-1423
Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022)
Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023)
Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
Malmaud, J., Levy, R., Berzak, Y.: Bridging information-seeking human gaze and machine reading comprehension. In: Proceedings of the 24th Conference on Computational Natural Language Learning, pp. 142–152. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.conll-1.11. https://aclanthology.org/2020.conll-1.11
de Leeuw, J.R.: jsPsych: A JavaScript library for creating behavioral experiments in a Web browser. Behavior Research Methods 47(1), 1–12 (2015). https://doi.org/10.3758/s13428-014-0458-y
Papoutsaki, A., Sangkloy, P., Laskey, J., Daskalova, N., Huang, J., Hays, J.: WebGazer: Scalable webcam eye tracking using user interactions. In: Proceedings of the Twenty-Fifth International Joint Conference on Artificial Intelligence (IJCAI'16), pp. 3839–3845. AAAI Press (2016)
Rayner, K., Pollatsek, A., Ashby, J., Clifton Jr, C.: Psychology of Reading (2012)
Liversedge, S.P., Findlay, J.M.: Saccadic eye movements and cognition. Trends in Cognitive Sciences 4(1), 6–14 (2000)
Rayner, K.: Eye movements in reading and information processing: 20 years of research. Psychological Bulletin 124(3), 372 (1998)
Radach, R., Kennedy, A.: Theoretical perspectives on eye movements in reading: Past controversies, current issues, and an agenda for future research. European Journal of Cognitive Psychology 16(1-2), 3–26 (2004)
Winke, P.M.: Eye-tracking technology for reading. The Companion to Language Assessment 2, 1029–1046 (2013)
Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: ZuCo, a simultaneous EEG and eye-tracking resource for natural sentence reading. Scientific Data 5(1), 1–13 (2018)
Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior Research Methods 49, 602–615 (2017)
Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: CELER: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022)
Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior Research Methods, 1–21. Springer (2022)
Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: Activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016)
Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in English as a second language: Evidence from the Multilingual Eye-movements Corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023)
Hollenstein, N., Barrett, M., Björnsdóttir, M.: The Copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182
Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior Research Methods, 1–7 (2023)
Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: Empirical foundations for a minimal reporting guideline. Behavior Research Methods 55(1), 364–416 (2023)
Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019)
Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer (2023)
Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence, pp. 4907–4913 (2021)
Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018). https://doi.org/10.3758/s13428-017-0908-4
González-Garduño, A.V., Søgaard, A.: Using gaze to predict text readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050. https://aclanthology.org/W17-5050
Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016. https://aclanthology.org/K16-1016
Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence classification with human attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030. https://aclanthology.org/K18-1030
Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2
Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012)
Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: How speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010)
Ferhat, O., Vilariño, F.: Low cost eye tracking: The current panorama. Computational Intelligence and Neuroscience 2016, 8680541. Hindawi Publishing Corporation (2016). https://doi.org/10.1155/2016/8680541
Papoutsaki, A.: Scalable webcam eye tracking by learning from user interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems (CHI EA '15), pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627
Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013). https://doi.org/10.1109/TPAMI.2012.101
Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014). https://doi.org/10.1109/TPAMI.2014.2313123
Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: TurkerGaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015)
Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam eye tracking for remote studies of web search. In: Proceedings of the 2017 Conference on Human Information Interaction and Retrieval (CHIIR '17), pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170
Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-based personalized summarization of Wikipedia reading session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext (HUMAN'20). Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662
Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature Communications 11(1), 4553 (2020)
Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022). https://doi.org/10.1016/j.bspc.2022.103521
Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera: Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022)
Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023)
Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421. https://aclanthology.org/2020.acl-main.421
Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264. https://aclanthology.org/D16-1264
Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval, vol. 39. Cambridge University Press, Cambridge (2008)
Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016). https://doi.org/10.3758/s13428-015-0642-8
Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000)
Just, M.A., Carpenter, P.A.: A theory of reading: From eye fixations to comprehension. Psychological Review 87(4), 329 (1980)
[2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. 
Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Papoutsaki, A., Sangkloy, P., Laskey, J., Daskalova, N., Huang, J., Hays, J.: Webgazer: Scalable webcam eye tracking using user interactions. In: Proceedings of the Twenty-Fifth International Joint Conference on Artificial Intelligence. IJCAI’16, pp. 3839–3845. AAAI Press, ??? (2016) Rayner et al. [2012] Rayner, K., Pollatsek, A., Ashby, J., Clifton Jr, C.: Psychology of reading (2012) Liversedge and Findlay [2000] Liversedge, S.P., Findlay, J.M.: Saccadic eye movements and cognition. Trends in cognitive sciences 4(1), 6–14 (2000) Rayner [1998] Rayner, K.: Eye movements in reading and information processing: 20 years of research. 
Psychological bulletin 124(3), 372 (1998) Radach and Kennedy [2004] Radach, R., Kennedy, A.: Theoretical perspectives on eye movements in reading: Past controversies, current issues, and an agenda for future research. European journal of cognitive psychology 16(1-2), 3–26 (2004) Winke [2013] Winke, P.M.: Eye-tracking technology for reading. The companion to language assessment 2, 1029–1046 (2013) Hollenstein et al. [2018] Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: Zuco, a simultaneous eeg and eye-tracking resource for natural sentence reading. Scientific data 5(1), 1–13 (2018) Cop et al. [2017] Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior research methods 49, 602–615 (2017) Berzak et al. [2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: Celer: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. 
[2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. 
In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . 
Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. 
[2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. 
Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. 
[2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Rayner, K., Pollatsek, A., Ashby, J., Clifton Jr, C.: Psychology of reading (2012) Liversedge and Findlay [2000] Liversedge, S.P., Findlay, J.M.: Saccadic eye movements and cognition. Trends in cognitive sciences 4(1), 6–14 (2000) Rayner [1998] Rayner, K.: Eye movements in reading and information processing: 20 years of research. 
Psychological bulletin 124(3), 372 (1998) Radach and Kennedy [2004] Radach, R., Kennedy, A.: Theoretical perspectives on eye movements in reading: Past controversies, current issues, and an agenda for future research. European journal of cognitive psychology 16(1-2), 3–26 (2004) Winke [2013] Winke, P.M.: Eye-tracking technology for reading. The companion to language assessment 2, 1029–1046 (2013) Hollenstein et al. [2018] Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: Zuco, a simultaneous eeg and eye-tracking resource for natural sentence reading. Scientific data 5(1), 1–13 (2018) Cop et al. [2017] Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior research methods 49, 602–615 (2017) Berzak et al. [2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: Celer: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. 
[2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. 
In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . 
Publisher: Hindawi Publishing Corporation

Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems (CHI EA '15), pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627

Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013). https://doi.org/10.1109/TPAMI.2012.101

Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014). https://doi.org/10.1109/TPAMI.2014.2313123

Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: TurkerGaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015)

Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Human Information Interaction and Retrieval (CHIIR '17), pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170

Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext (HUMAN '20). Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662

Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature Communications 11(1), 4553 (2020)

Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022). https://doi.org/10.1016/j.bspc.2022.103521

Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera: Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022)

Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023)

Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421

Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264

Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval, vol. 39. Cambridge University Press, Cambridge (2008)

Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016). https://doi.org/10.3758/s13428-015-0642-8

Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000)

Just, M.A., Carpenter, P.A.: A theory of reading: From eye fixations to comprehension. Psychological Review 87(4), 329 (1980)

DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408

Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021)

Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022)

Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023)

Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022)

Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155. IEEE (2021)

Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423

Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022)

Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023)

Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)

Liversedge, S.P., Findlay, J.M.: Saccadic eye movements and cognition. Trends in Cognitive Sciences 4(1), 6–14 (2000)

Rayner, K.: Eye movements in reading and information processing: 20 years of research. Psychological Bulletin 124(3), 372 (1998)

Radach, R., Kennedy, A.: Theoretical perspectives on eye movements in reading: Past controversies, current issues, and an agenda for future research. European Journal of Cognitive Psychology 16(1–2), 3–26 (2004)

Winke, P.M.: Eye-tracking technology for reading. The Companion to Language Assessment 2, 1029–1046 (2013)

Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: ZuCo, a simultaneous EEG and eye-tracking resource for natural sentence reading. Scientific Data 5(1), 1–13 (2018)

Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior Research Methods 49, 602–615 (2017)

Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: CELER: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022)

Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior Research Methods, 1–21. Springer (2022)

Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: Activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016)

Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in English as a second language: Evidence from the Multilingual Eye-movements Corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023)

Hollenstein, N., Barrett, M., Björnsdóttir, M.: The Copenhagen Corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182

Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior Research Methods, 1–7 (2023)

Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: Empirical foundations for a minimal reporting guideline. Behavior Research Methods 55(1), 364–416 (2023)

Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019)

Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer (2023)

Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence, pp. 4907–4913 (2021)

Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018). https://doi.org/10.3758/s13428-017-0908-4

González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050

Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016

Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030

Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2

Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012)

Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: How speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010)

Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016). https://doi.org/10.1155/2016/8680541. Publisher: Hindawi Publishing Corporation
[2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. 
[2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . 
https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE
[2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. 
In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. 
In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . 
https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. 
[2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: Celer: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. 
[2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. 
[2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. 
[2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. 
[2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 
71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. 
Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. 
European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. 
[2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. 
Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: TurkerGaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Human Information Interaction and Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature Communications 11(1), 4553 (2020) Lin et al.
[2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. 
Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in English as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The Copenhagen Corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720.
European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182
[2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 
71–78 (2000)
Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological Review 87(4), 329 (1980)
DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408
Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021)
Chiang, C.-H., Lee, H.-Y.: Re-examining human annotations for interpretable NLP (2022)
Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023)
Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022)
Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE
https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 
45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. 
In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. 
In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . 
https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. 
[2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. 
In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. 
[2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. 
[2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . 
https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. 
[2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. 
Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2
Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012)
Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: How speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010)
Ferhat, O., Vilariño, F.: Low cost eye tracking: The current panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016). https://doi.org/10.1155/2016/8680541
Papoutsaki, A.: Scalable webcam eye tracking by learning from user interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems (CHI EA '15), pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627
Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013). https://doi.org/10.1109/TPAMI.2012.101
Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014). https://doi.org/10.1109/TPAMI.2014.2313123
Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: TurkerGaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015)
Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam eye tracking for remote studies of web search. In: Proceedings of the 2017 Conference on Human Information Interaction and Retrieval (CHIIR '17), pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170
Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-based personalized summarization of Wikipedia reading session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext (HUMAN '20). Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662
Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature Communications 11(1), 4553 (2020)
Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022). https://doi.org/10.1016/j.bspc.2022.103521
Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera: Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022)
Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023)
Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421
Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264
Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval. Cambridge University Press, Cambridge (2008)
Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016). https://doi.org/10.3758/s13428-015-0642-8
Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000)
Just, M.A., Carpenter, P.A.: A theory of reading: From eye fixations to comprehension. Psychological Review 87(4), 329 (1980)
DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408
Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021)
Chiang, C.-H., Lee, H.-Y.: Re-examining human annotations for interpretable NLP (2022)
Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023)
Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022)
Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155. IEEE (2021)
Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423
Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022)
Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023)
Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
González-Garduño, A.V., Søgaard, A.: Using gaze to predict text readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050
Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016
Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence classification with human attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030
71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. 
Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. 
[2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. 
Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. 
Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . 
https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. 
Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. 
Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. 
IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. 
[2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 
71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. 
Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423. https://aclanthology.org/N19-1423
Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022)
Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023)
Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627
Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013). https://doi.org/10.1109/TPAMI.2012.101
Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014). https://doi.org/10.1109/TPAMI.2014.2313123
Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: TurkerGaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015)
Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Human Information Interaction and Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170
Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN ’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662
Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature Communications 11(1), 4553 (2020)
Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022). https://doi.org/10.1016/j.bspc.2022.103521
Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera: Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022)
Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023)
Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421. https://aclanthology.org/2020.acl-main.421
Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264. https://aclanthology.org/D16-1264
Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval, vol. 39. Cambridge University Press, Cambridge (2008)
Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016). https://doi.org/10.3758/s13428-015-0642-8
Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000)
Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: From eye fixations to comprehension. Psychological Review 87(4), 329 (1980)
DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408.
https://aclanthology.org/2020.acl-main.408
Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021)
Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022)
Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023)
Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022)
Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155. IEEE (2021)
Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. 
Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. 
Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). 
https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. 
[2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. 
[2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. 
In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. 
[2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. 
In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 
1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. 
Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. 
[2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
  16. Leeuw, J.R.: jsPsych: A JavaScript library for creating behavioral experiments in a Web browser. Behavior Research Methods 47(1), 1–12 (2015) https://doi.org/10.3758/s13428-014-0458-y Papoutsaki et al. [2016] Papoutsaki, A., Sangkloy, P., Laskey, J., Daskalova, N., Huang, J., Hays, J.: Webgazer: Scalable webcam eye tracking using user interactions. In: Proceedings of the Twenty-Fifth International Joint Conference on Artificial Intelligence. IJCAI’16, pp. 3839–3845. AAAI Press, ??? (2016) Rayner et al. [2012] Rayner, K., Pollatsek, A., Ashby, J., Clifton Jr, C.: Psychology of reading (2012) Liversedge and Findlay [2000] Liversedge, S.P., Findlay, J.M.: Saccadic eye movements and cognition. Trends in cognitive sciences 4(1), 6–14 (2000) Rayner [1998] Rayner, K.: Eye movements in reading and information processing: 20 years of research. Psychological bulletin 124(3), 372 (1998) Radach and Kennedy [2004] Radach, R., Kennedy, A.: Theoretical perspectives on eye movements in reading: Past controversies, current issues, and an agenda for future research. European journal of cognitive psychology 16(1-2), 3–26 (2004) Winke [2013] Winke, P.M.: Eye-tracking technology for reading. The companion to language assessment 2, 1029–1046 (2013) Hollenstein et al. [2018] Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: Zuco, a simultaneous eeg and eye-tracking resource for natural sentence reading. Scientific data 5(1), 1–13 (2018) Cop et al. [2017] Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior research methods 49, 602–615 (2017) Berzak et al. [2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: Celer: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. 
[2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. 
arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. 
In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. 
[2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Human Information Interaction and Retrieval. CHIIR '17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170
Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN'20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662
Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature Communications 11(1), 4553 (2020)
Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022). https://doi.org/10.1016/j.bspc.2022.103521
Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera: Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022)
Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023)
Artetxe et al.
[2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421. https://aclanthology.org/2020.acl-main.421
Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264. https://aclanthology.org/D16-1264
Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval. Cambridge University Press, Cambridge (2008)
Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016). https://doi.org/10.3758/s13428-015-0642-8
Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000)
Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: From eye fixations to comprehension. Psychological Review 87(4), 329 (1980)
DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408.
https://aclanthology.org/2020.acl-main.408
Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021)
Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022)
Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023)
Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022)
Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE
Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423. https://aclanthology.org/N19-1423
Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022)
Ikhwantri et al.
[2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023)
Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
Papoutsaki et al. [2016] Papoutsaki, A., Sangkloy, P., Laskey, J., Daskalova, N., Huang, J., Hays, J.: WebGazer: Scalable webcam eye tracking using user interactions. In: Proceedings of the Twenty-Fifth International Joint Conference on Artificial Intelligence. IJCAI'16, pp. 3839–3845. AAAI Press (2016)
Rayner et al. [2012] Rayner, K., Pollatsek, A., Ashby, J., Clifton Jr, C.: Psychology of Reading (2012)
Liversedge and Findlay [2000] Liversedge, S.P., Findlay, J.M.: Saccadic eye movements and cognition. Trends in Cognitive Sciences 4(1), 6–14 (2000)
Rayner [1998] Rayner, K.: Eye movements in reading and information processing: 20 years of research. Psychological Bulletin 124(3), 372 (1998)
Radach and Kennedy [2004] Radach, R., Kennedy, A.: Theoretical perspectives on eye movements in reading: Past controversies, current issues, and an agenda for future research. European Journal of Cognitive Psychology 16(1-2), 3–26 (2004)
Winke [2013] Winke, P.M.: Eye-tracking technology for reading. The Companion to Language Assessment 2, 1029–1046 (2013)
Hollenstein et al. [2018] Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: ZuCo, a simultaneous EEG and eye-tracking resource for natural sentence reading. Scientific Data 5(1), 1–13 (2018)
Cop et al. [2017] Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior Research Methods 49, 602–615 (2017)
Berzak et al.
[2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: CELER: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022)
Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior Research Methods, 1–21 (2022)
Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: Activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016)
Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in English as a second language: Evidence from the Multilingual Eye-Movements Corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023)
Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The Copenhagen Corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182
Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior Research Methods, 1–7 (2023)
Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: Empirical foundations for a minimal reporting guideline.
Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . 
https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. 
[2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. 
[2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. 
In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 
1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Rayner, K., Pollatsek, A., Ashby, J., Clifton Jr, C.: Psychology of reading (2012) Liversedge and Findlay [2000] Liversedge, S.P., Findlay, J.M.: Saccadic eye movements and cognition. Trends in cognitive sciences 4(1), 6–14 (2000) Rayner [1998] Rayner, K.: Eye movements in reading and information processing: 20 years of research. Psychological bulletin 124(3), 372 (1998) Radach and Kennedy [2004] Radach, R., Kennedy, A.: Theoretical perspectives on eye movements in reading: Past controversies, current issues, and an agenda for future research. European journal of cognitive psychology 16(1-2), 3–26 (2004) Winke [2013] Winke, P.M.: Eye-tracking technology for reading. The companion to language assessment 2, 1029–1046 (2013) Hollenstein et al. [2018] Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: Zuco, a simultaneous eeg and eye-tracking resource for natural sentence reading. Scientific data 5(1), 1–13 (2018) Cop et al. [2017] Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior research methods 49, 602–615 (2017) Berzak et al. [2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: Celer: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. 
[2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. 
arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. 
In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. 
[2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. 
[2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 https://aclanthology.org/2020.acl-main.421
Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 https://aclanthology.org/D16-1264
Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval, vol. 39. Cambridge University Press, Cambridge (2008)
Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016). https://doi.org/10.3758/s13428-015-0642-8
Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000)
Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: From eye fixations to comprehension. Psychological Review 87(4), 329 (1980)
DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 https://aclanthology.org/2020.acl-main.408
Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021)
Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022)
Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023)
Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022)
Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155. IEEE (2021)
Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 https://aclanthology.org/N19-1423
Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022)
Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023)
Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
Liversedge and Findlay [2000] Liversedge, S.P., Findlay, J.M.: Saccadic eye movements and cognition. Trends in Cognitive Sciences 4(1), 6–14 (2000)
Rayner [1998] Rayner, K.: Eye movements in reading and information processing: 20 years of research. Psychological Bulletin 124(3), 372 (1998)
Radach and Kennedy [2004] Radach, R., Kennedy, A.: Theoretical perspectives on eye movements in reading: Past controversies, current issues, and an agenda for future research. European Journal of Cognitive Psychology 16(1-2), 3–26 (2004)
Winke [2013] Winke, P.M.: Eye-tracking technology for reading. The Companion to Language Assessment 2, 1029–1046 (2013)
Hollenstein et al. [2018] Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: ZuCo, a simultaneous EEG and eye-tracking resource for natural sentence reading. Scientific Data 5(1), 1–13 (2018)
Cop et al. [2017] Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior Research Methods 49, 602–615 (2017)
Berzak et al. [2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: CELER: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022)
Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO).
Behavior Research Methods, 1–21 (2022)
Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: Activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016)
Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in English as a second language: Evidence from the Multilingual Eye-movements Corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023)
Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The Copenhagen Corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182
Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior Research Methods, 1–7 (2023)
Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: Empirical foundations for a minimal reporting guideline. Behavior Research Methods 55(1), 364–416 (2023)
Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019)
Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer (2023)
Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence, pp. 4907–4913 (2021)
Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018). https://doi.org/10.3758/s13428-017-0908-4
González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using gaze to predict text readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 https://aclanthology.org/W17-5050
Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 https://aclanthology.org/K16-1016
Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence classification with human attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 https://aclanthology.org/K18-1030
Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2
Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012)
Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: How speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010)
Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low cost eye tracking: The current panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016). https://doi.org/10.1155/2016/8680541
Papoutsaki [2015] Papoutsaki, A.: Scalable webcam eye tracking by learning from user interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627
Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013). https://doi.org/10.1109/TPAMI.2012.101
Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014). https://doi.org/10.1109/TPAMI.2014.2313123
Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: TurkerGaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015)
Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam eye tracking for remote studies of web search. In: Proceedings of the 2017 Conference on Human Information Interaction and Retrieval. CHIIR ’17, pp. 17–26.
Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170
Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-based personalized summarization of Wikipedia reading session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662
Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature Communications 11(1), 4553 (2020)
Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022). https://doi.org/10.1016/j.bspc.2022.103521
Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera – Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022)
Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023)
Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. 
[2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Winke, P.M.: Eye-tracking technology for reading. The companion to language assessment 2, 1029–1046 (2013) Hollenstein et al. [2018] Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: Zuco, a simultaneous eeg and eye-tracking resource for natural sentence reading. Scientific data 5(1), 1–13 (2018) Cop et al. [2017] Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. 
Behavior research methods 49, 602–615 (2017) Berzak et al. [2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: Celer: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. 
[2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. 
[2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. 
[2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. 
[2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 
71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. 
Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: Zuco, a simultaneous eeg and eye-tracking resource for natural sentence reading. Scientific data 5(1), 1–13 (2018) Cop et al. [2017] Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior research methods 49, 602–615 (2017) Berzak et al. [2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: Celer: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. 
Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 
4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. 
[2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. 
[2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. 
[2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Cop et al. [2017] Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior Research Methods 49, 602–615 (2017) Berzak et al. [2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: CELER: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior Research Methods, 1–21 (2022). Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: Activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in English as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The Copenhagen Corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al.
[2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior Research Methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: Empirical foundations for a minimal reporting guideline. Behavior Research Methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050. https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis.
In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . 
Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: TurkerGaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Human Information Interaction and Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature Communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests.
Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421. https://aclanthology.org/2020.acl-main.421
1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. 
[2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. 
[2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. 
[2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. 
[2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. 
In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 
1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. 
arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. 
In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. 
[2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. 
[2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . 
https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. 
[2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. 
In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. 
In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: TurkerGaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction and Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature Communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera: Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al.
[2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval. Cambridge University Press, Cambridge (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: From eye fixations to comprehension. Psychological Review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing.
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . 
https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. 
CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. 
[2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. 
Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. 
In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. 
In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . 
https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. 
[2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. 
In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. 
IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. 
In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval. Cambridge University Press, Cambridge (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016). https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: From eye fixations to comprehension. Psychological Review 87(4), 329 (1980) DeYoung et al.
[2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. 
[2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018). https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using gaze to predict text readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166.
Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016
[2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. 
[2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. 
[2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. 
[2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. 
In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 
1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. 
Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. 
[2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. 
In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. 
[2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. 
[2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. 
[2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. 
In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 
1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. 
Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421
Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264
Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval, vol. 39. Cambridge University Press, Cambridge (2008)
Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016). https://doi.org/10.3758/s13428-015-0642-8
Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000)
Just, M.A., Carpenter, P.A.: A theory of reading: From eye fixations to comprehension. Psychological Review 87(4), 329 (1980)
DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408
Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021)
Chiang, C.-H., Lee, H.-Y.: Re-examining human annotations for interpretable NLP (2022)
Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023)
Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022)
Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155. IEEE (2021)
Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423
Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022)
Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023)
Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature Communications 11(1), 4553 (2020)
Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022). https://doi.org/10.1016/j.bspc.2022.103521
Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera: Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022)
Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023)
In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. 
[2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. 
[2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. 
In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 
45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 
45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. 
Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). 
IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . 
https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
  17. Papoutsaki, A., Sangkloy, P., Laskey, J., Daskalova, N., Huang, J., Hays, J.: WebGazer: Scalable webcam eye tracking using user interactions. In: Proceedings of the Twenty-Fifth International Joint Conference on Artificial Intelligence. IJCAI'16, pp. 3839–3845. AAAI Press (2016)
  18. Rayner, K., Pollatsek, A., Ashby, J., Clifton Jr, C.: Psychology of Reading (2012)
  19. Liversedge, S.P., Findlay, J.M.: Saccadic eye movements and cognition. Trends in Cognitive Sciences 4(1), 6–14 (2000)
  20. Rayner, K.: Eye movements in reading and information processing: 20 years of research. Psychological Bulletin 124(3), 372 (1998)
  21. Radach, R., Kennedy, A.: Theoretical perspectives on eye movements in reading: Past controversies, current issues, and an agenda for future research. European Journal of Cognitive Psychology 16(1–2), 3–26 (2004)
  22. Winke, P.M.: Eye-tracking technology for reading. The Companion to Language Assessment 2, 1029–1046 (2013)
  23. Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: ZuCo, a simultaneous EEG and eye-tracking resource for natural sentence reading. Scientific Data 5(1), 1–13 (2018)
  24. Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior Research Methods 49, 602–615 (2017)
  25. Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: CELER: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022)
  26. Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior Research Methods, 1–21 (2022). Springer
  27. Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: Activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016)
  28. Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in English as a second language: Evidence from the Multilingual Eye-Movements Corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023)
  29. Hollenstein, N., Barrett, M., Björnsdóttir, M.: The Copenhagen Corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182
  30. Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior Research Methods, 1–7 (2023)
  31. Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: Empirical foundations for a minimal reporting guideline. Behavior Research Methods 55(1), 364–416 (2023)
  32. Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019)
  33. Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer (2023)
  34. Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence, pp. 4907–4913 (2021)
  35. Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018). https://doi.org/10.3758/s13428-017-0908-4
  36. González-Garduño, A.V., Søgaard, A.: Using gaze to predict text readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050. https://aclanthology.org/W17-5050
  37. Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016. https://aclanthology.org/K16-1016
  38. Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence classification with human attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030. https://aclanthology.org/K18-1030
  39. Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2
  40. Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012)
  41. Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: How speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010)
  42. Ferhat, O., Vilariño, F.: Low cost eye tracking: The current panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016). https://doi.org/10.1155/2016/8680541. Hindawi Publishing Corporation
  43. Papoutsaki, A.: Scalable webcam eye tracking by learning from user interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA '15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627
  44. Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013). https://doi.org/10.1109/TPAMI.2012.101
  45. Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014). https://doi.org/10.1109/TPAMI.2014.2313123
  46. Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: TurkerGaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015)
  47. Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam eye tracking for remote studies of web search. In: Proceedings of the 2017 Conference on Human Information Interaction and Retrieval. CHIIR '17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170
  48. Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-based personalized summarization of Wikipedia reading session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN'20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662
  49. Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature Communications 11(1), 4553 (2020)
  50. Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022). https://doi.org/10.1016/j.bspc.2022.103521
  51. Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera: Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022)
  52. Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023)
  53. Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421. https://aclanthology.org/2020.acl-main.421
  54. Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264. https://aclanthology.org/D16-1264
  55. Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval. Cambridge University Press, Cambridge (2008)
  56. Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016). https://doi.org/10.3758/s13428-015-0642-8
  57. Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000)
  58. Just, M.A., Carpenter, P.A.: A theory of reading: From eye fixations to comprehension. Psychological Review 87(4), 329 (1980)
  59. DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408. https://aclanthology.org/2020.acl-main.408
  60. Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021)
  61. Chiang, C.-H., Lee, H.-Y.: Re-examining human annotations for interpretable NLP (2022)
  62. Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023)
  63. Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022)
  64. Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155. IEEE (2021)
  65. Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423. https://aclanthology.org/N19-1423
  66. Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022)
  67. Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023)
  68. Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Rayner, K., Pollatsek, A., Ashby, J., Clifton Jr, C.: Psychology of reading (2012) Liversedge and Findlay [2000] Liversedge, S.P., Findlay, J.M.: Saccadic eye movements and cognition. Trends in cognitive sciences 4(1), 6–14 (2000) Rayner [1998] Rayner, K.: Eye movements in reading and information processing: 20 years of research. Psychological bulletin 124(3), 372 (1998) Radach and Kennedy [2004] Radach, R., Kennedy, A.: Theoretical perspectives on eye movements in reading: Past controversies, current issues, and an agenda for future research. European journal of cognitive psychology 16(1-2), 3–26 (2004) Winke [2013] Winke, P.M.: Eye-tracking technology for reading. The companion to language assessment 2, 1029–1046 (2013) Hollenstein et al. [2018] Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: Zuco, a simultaneous eeg and eye-tracking resource for natural sentence reading. Scientific data 5(1), 1–13 (2018) Cop et al. [2017] Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior research methods 49, 602–615 (2017) Berzak et al. [2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: Celer: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). Publisher: Springer Desai et al. 
[2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. 
In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. 
[2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. 
[2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. 
[2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021)
Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: Zuco, a simultaneous eeg and eye-tracking resource for natural sentence reading. Scientific data 5(1), 1–13 (2018) Cop et al. [2017] Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior research methods 49, 602–615 (2017) Berzak et al. [2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: Celer: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. 
Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 
4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. 
[2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. 
[2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. 
[2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. 
In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . 
https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. 
Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. 
[2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. 
Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. 
Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. 
Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. 
Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). 
https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 
45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. 
[2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. 
[2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. 
[2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010)
Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016). https://doi.org/10.1155/2016/8680541
Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems (CHI EA '15), pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627
Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013). https://doi.org/10.1109/TPAMI.2012.101
Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014). https://doi.org/10.1109/TPAMI.2014.2313123
Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015)
Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Human Information Interaction and Retrieval (CHIIR '17), pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170
Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext (HUMAN '20). Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662
Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature Communications 11(1), 4553 (2020)
Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022). https://doi.org/10.1016/j.bspc.2022.103521
Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera: Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022)
Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023)
Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421
Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264
Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval, vol. 39. Cambridge University Press, Cambridge (2008)
Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016). https://doi.org/10.3758/s13428-015-0642-8
Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000)
Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological Review 87(4), 329 (1980)
DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408
Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021)
Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022)
Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023)
Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022)
Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155. IEEE (2021)
Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423
Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022)
Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023)
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. 
[2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. 
[2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. 
Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. 
[2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. 
Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. 
[2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. 
[2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. 
In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. 
[2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. 
[2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. 
In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 
18. Rayner, K., Pollatsek, A., Ashby, J., Clifton Jr, C.: Psychology of Reading (2012)
19. Liversedge, S.P., Findlay, J.M.: Saccadic eye movements and cognition. Trends in Cognitive Sciences 4(1), 6–14 (2000)
20. Rayner, K.: Eye movements in reading and information processing: 20 years of research. Psychological Bulletin 124(3), 372 (1998)
21. Radach, R., Kennedy, A.: Theoretical perspectives on eye movements in reading: Past controversies, current issues, and an agenda for future research. European Journal of Cognitive Psychology 16(1–2), 3–26 (2004)
22. Winke, P.M.: Eye-tracking technology for reading. The Companion to Language Assessment 2, 1029–1046 (2013)
23. Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: ZuCo, a simultaneous EEG and eye-tracking resource for natural sentence reading. Scientific Data 5(1), 1–13 (2018)
24. Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior Research Methods 49, 602–615 (2017)
25. Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: CELER: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022)
26. Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior Research Methods, 1–21 (2022). Publisher: Springer
27. Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: Activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016)
28. Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in English as a second language: Evidence from the Multilingual Eye-movements Corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023)
29. Hollenstein, N., Barrett, M., Björnsdóttir, M.: The Copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182
30. Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior Research Methods, 1–7 (2023)
31. Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: Empirical foundations for a minimal reporting guideline. Behavior Research Methods 55(1), 364–416 (2023)
32. Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019)
33. Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer (2023)
34. Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence, pp. 4907–4913 (2021)
35. Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018). https://doi.org/10.3758/s13428-017-0908-4
36. González-Garduño, A.V., Søgaard, A.: Using gaze to predict text readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050
37. Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016
38. Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence classification with human attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030
39. Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2
40. Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012)
41. Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: How speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010)
42. Ferhat, O., Vilariño, F.: Low cost eye tracking: The current panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016). https://doi.org/10.1155/2016/8680541. Publisher: Hindawi Publishing Corporation
43. Papoutsaki, A.: Scalable webcam eye tracking by learning from user interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems (CHI EA '15), pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627
44. Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013). https://doi.org/10.1109/TPAMI.2012.101
45. Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014). https://doi.org/10.1109/TPAMI.2014.2313123
46. Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: TurkerGaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015)
47. Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam eye tracking for remote studies of web search. In: Proceedings of the 2017 Conference on Human Information Interaction and Retrieval (CHIIR '17), pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170
48. Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-based personalized summarization of Wikipedia reading session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext (HUMAN '20). Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662
49. Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature Communications 11(1), 4553 (2020)
50. Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022). https://doi.org/10.1016/j.bspc.2022.103521
51. Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera: Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022)
52. Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023)
53. Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421
54. Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264
55. Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval, vol. 39. Cambridge University Press, Cambridge (2008)
56. Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016). https://doi.org/10.3758/s13428-015-0642-8
57. Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000)
58. Just, M.A., Carpenter, P.A.: A theory of reading: From eye fixations to comprehension. Psychological Review 87(4), 329 (1980)
59. DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408
60. Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021)
61. Chiang, C.-H., Lee, H.-Y.: Re-examining human annotations for interpretable NLP (2022)
62. Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023)
63. Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022)
64. Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155. IEEE (2021)
65. Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423
66. Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022)
67. Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023)
68. Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. 
[2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. 
[2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. 
71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. 
Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: Zuco, a simultaneous eeg and eye-tracking resource for natural sentence reading. Scientific data 5(1), 1–13 (2018) Cop et al. [2017] Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior research methods 49, 602–615 (2017) Berzak et al. [2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: Celer: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. 
Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 
[2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. 
[2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. 
[2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. 
Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . 
https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. 
[2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. 
[2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. 
In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 
1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. 
[2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. 
[2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. 
Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: TurkerGaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015)
Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam eye tracking for remote studies of web search. In: Proceedings of the 2017 Conference on Human Information Interaction and Retrieval, CHIIR '17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170
Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-based personalized summarization of Wikipedia reading session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext, HUMAN '20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662
Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature Communications 11(1), 4553 (2020)
Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022). https://doi.org/10.1016/j.bspc.2022.103521
Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera: Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022)
Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023)
Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421
Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264
Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval. Cambridge University Press, Cambridge (2008)
Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016). https://doi.org/10.3758/s13428-015-0642-8
Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000)
Just, M.A., Carpenter, P.A.: A theory of reading: From eye fixations to comprehension. Psychological Review 87(4), 329 (1980)
DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408
Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021)
Chiang, C.-H., Lee, H.-Y.: Re-examining human annotations for interpretable NLP (2022)
Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023)
Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022)
Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155. IEEE (2021)
Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423
Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022)
Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023)
Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in English as a second language: Evidence from the Multilingual Eye-Movements Corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023)
Hollenstein, N., Barrett, M., Björnsdóttir, M.: The Copenhagen Corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182
Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior Research Methods, 1–7 (2023)
Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: Empirical foundations for a minimal reporting guideline. Behavior Research Methods 55(1), 364–416 (2023)
Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019)
Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer (2023)
Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence, pp. 4907–4913 (2021)
Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018). https://doi.org/10.3758/s13428-017-0908-4
González-Garduño, A.V., Søgaard, A.: Using gaze to predict text readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050
Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016
Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence classification with human attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030
Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2
Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012)
Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: How speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010)
Ferhat, O., Vilariño, F.: Low cost eye tracking: The current panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016). https://doi.org/10.1155/2016/8680541
Papoutsaki, A.: Scalable webcam eye tracking by learning from user interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems, CHI EA '15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627
Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013). https://doi.org/10.1109/TPAMI.2012.101
Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014). https://doi.org/10.1109/TPAMI.2014.2313123
In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. 
In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . 
In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. 
Nature Communications 11(1), 4553 (2020)
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. 
[2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. 
Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. 
Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. 
Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. 
Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). 
https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 
45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. 
[2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. 
[2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. 
[2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021)
Chiang, C.-H., Lee, H.-Y.: Re-examining human annotations for interpretable NLP (2022)
Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023)
Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022)
Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movement events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155. IEEE (2021)
Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423
Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022)
Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023)
Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: How speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010)
Ferhat, O., Vilariño, F.: Low cost eye tracking: The current panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016). https://doi.org/10.1155/2016/8680541
Papoutsaki, A.: Scalable webcam eye tracking by learning from user interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA '15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627
Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013). https://doi.org/10.1109/TPAMI.2012.101
Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014). https://doi.org/10.1109/TPAMI.2014.2313123
Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: TurkerGaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015)
Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam eye tracking for remote studies of web search. In: Proceedings of the 2017 Conference on Human Information Interaction and Retrieval. CHIIR '17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170
Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-based personalized summarization of Wikipedia reading session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN '20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662
Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature Communications 11(1), 4553 (2020)
Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022). https://doi.org/10.1016/j.bspc.2022.103521
Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera: Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022)
Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023)
Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421
Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264
Manning, C.D., Raghavan, P., Schütze, H.: Introduction to Information Retrieval. Cambridge University Press, Cambridge (2008)
Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016). https://doi.org/10.3758/s13428-015-0642-8
Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000)
Just, M.A., Carpenter, P.A.: A theory of reading: From eye fixations to comprehension. Psychological Review 87(4), 329 (1980)
DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408
Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing.
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021)
Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. 
[2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. 
[2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. 
Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. 
[2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. 
Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. 
[2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. 
[2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. 
In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. 
[2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. 
[2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. 
In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 
45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 
45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. 
Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). 
IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . 
https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
19. Liversedge, S.P., Findlay, J.M.: Saccadic eye movements and cognition. Trends in Cognitive Sciences 4(1), 6–14 (2000)
20. Rayner, K.: Eye movements in reading and information processing: 20 years of research. Psychological Bulletin 124(3), 372 (1998)
21. Radach, R., Kennedy, A.: Theoretical perspectives on eye movements in reading: Past controversies, current issues, and an agenda for future research. European Journal of Cognitive Psychology 16(1–2), 3–26 (2004)
22. Winke, P.M.: Eye-tracking technology for reading. The Companion to Language Assessment 2, 1029–1046 (2013)
23. Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: ZuCo, a simultaneous EEG and eye-tracking resource for natural sentence reading. Scientific Data 5(1), 1–13 (2018)
24. Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior Research Methods 49, 602–615 (2017)
25. Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: CELER: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022)
26. Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior Research Methods, 1–21 (2022)
27. Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: Activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016)
28. Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in English as a second language: Evidence from the Multilingual Eye-movements Corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023)
29. Hollenstein, N., Barrett, M., Björnsdóttir, M.: The Copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182
30. Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior Research Methods, 1–7 (2023)
31. Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: Empirical foundations for a minimal reporting guideline. Behavior Research Methods 55(1), 364–416 (2023)
32. Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019)
33. Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer (2023)
34. Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence, pp. 4907–4913 (2021)
35. Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018). https://doi.org/10.3758/s13428-017-0908-4
36. González-Garduño, A.V., Søgaard, A.: Using gaze to predict text readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050
37. Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016
38. Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence classification with human attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030
39. Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2
40. Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012)
41. Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: How speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010)
42. Ferhat, O., Vilariño, F.: Low cost eye tracking: The current panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016). https://doi.org/10.1155/2016/8680541
43. Papoutsaki, A.: Scalable webcam eye tracking by learning from user interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems (CHI EA '15), pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627
44. Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013). https://doi.org/10.1109/TPAMI.2012.101
45. Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014). https://doi.org/10.1109/TPAMI.2014.2313123
46. Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: TurkerGaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015)
47. Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam eye tracking for remote studies of web search. In: Proceedings of the 2017 Conference on Human Information Interaction and Retrieval (CHIIR '17), pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170
48. Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-based personalized summarization of Wikipedia reading session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext (HUMAN '20). Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662
49. Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature Communications 11(1), 4553 (2020)
50. Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022). https://doi.org/10.1016/j.bspc.2022.103521
51. Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera: Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022)
52. Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023)
53. Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421
54. Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264
55. Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval, vol. 39. Cambridge University Press, Cambridge (2008)
56. Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016). https://doi.org/10.3758/s13428-015-0642-8
57. Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000)
58. Just, M.A., Carpenter, P.A.: A theory of reading: From eye fixations to comprehension. Psychological Review 87(4), 329 (1980)
59. DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408
60. Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021)
61. Chiang, C.-H., Lee, H.-Y.: Re-examining human annotations for interpretable NLP (2022)
62. Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023)
63. Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022)
64. Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155. IEEE (2021)
65. Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423
66. Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022)
67. Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023)
68. Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. 
In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . 
Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. 
[2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior research methods 49, 602–615 (2017) Berzak et al. [2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: Celer: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. 
[2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. 
In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . 
https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. 
Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. 
Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: Celer: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. 
[2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 
4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. 
[2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. 
[2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. 
[2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior Research Methods, 1–21 (2022) Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: Activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in English as a second language: Evidence from the Multilingual Eye-movements Corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The Copenhagen Corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior Research Methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: Empirical foundations for a minimal reporting guideline.
Behavior Research Methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018). https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using gaze to predict text readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050. https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016. https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence classification with human attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030.
https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: How speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low cost eye tracking: The current panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016). https://doi.org/10.1155/2016/8680541 Papoutsaki [2015] Papoutsaki, A.: Scalable webcam eye tracking by learning from user interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013). https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014). https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al.
[2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: TurkerGaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam eye tracking for remote studies of web search. In: Proceedings of the 2017 Conference on Human Information Interaction and Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-based personalized summarization of Wikipedia reading session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature Communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022). https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera: Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al.
[2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421. https://aclanthology.org/2020.acl-main.421
In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. 
In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . 
In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. 
IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. 
In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. 
[2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. 
[2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. 
Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. 
[2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. 
Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. 
Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. 
Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. 
In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. 
Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. 
Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016). https://doi.org/10.3758/s13428-015-0642-8
Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000)
Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: From eye fixations to comprehension. Psychological Review 87(4), 329 (1980)
DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408
Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021)
Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-examining human annotations for interpretable NLP (2022)
Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023)
Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022)
Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155. IEEE (2021)
Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423
Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022)
Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023)
Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018). https://doi.org/10.3758/s13428-017-0908-4
González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using gaze to predict text readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050
Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016
Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence classification with human attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030
Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2
Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012)
Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: How speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010)
Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low cost eye tracking: The current panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016). https://doi.org/10.1155/2016/8680541
Papoutsaki [2015] Papoutsaki, A.: Scalable webcam eye tracking by learning from user interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA '15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627
Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013). https://doi.org/10.1109/TPAMI.2012.101
Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014). https://doi.org/10.1109/TPAMI.2014.2313123
Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: TurkerGaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015)
Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam eye tracking for remote studies of web search. In: Proceedings of the 2017 Conference on Human Information Interaction and Retrieval. CHIIR '17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170
Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-based personalized summarization of Wikipedia reading session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN'20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662
Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature Communications 11(1), 4553 (2020)
Lin et al.
[2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022). https://doi.org/10.1016/j.bspc.2022.103521
Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera: Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022)
Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023)
Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421
Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264
Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval, vol. 39. Cambridge University Press, Cambridge (2008)
Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. 
Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. 
[2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. 
[2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. 
In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. 
[2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. 
[2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. 
In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 
45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 
45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. 
Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). 
IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . 
https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
Rayner, K.: Eye movements in reading and information processing: 20 years of research. Psychological Bulletin 124(3), 372 (1998)
Radach, R., Kennedy, A.: Theoretical perspectives on eye movements in reading: Past controversies, current issues, and an agenda for future research. European Journal of Cognitive Psychology 16(1–2), 3–26 (2004)
Winke, P.M.: Eye-tracking technology for reading. The Companion to Language Assessment 2, 1029–1046 (2013)
Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: ZuCo, a simultaneous EEG and eye-tracking resource for natural sentence reading. Scientific Data 5(1), 1–13 (2018)
Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior Research Methods 49, 602–615 (2017)
Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: CELER: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022)
Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior Research Methods, 1–21 (2022)
Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: Activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016)
Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in English as a second language: Evidence from the Multilingual Eye-movements Corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023)
Hollenstein, N., Barrett, M., Björnsdóttir, M.: The Copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182
Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior Research Methods, 1–7 (2023)
Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: Empirical foundations for a minimal reporting guideline. Behavior Research Methods 55(1), 364–416 (2023)
Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019)
Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer (2023)
Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence, pp. 4907–4913 (2021)
Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018). https://doi.org/10.3758/s13428-017-0908-4
González-Garduño, A.V., Søgaard, A.: Using gaze to predict text readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050
Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016
Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence classification with human attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030
Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2
Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012)
Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: How speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010)
Ferhat, O., Vilariño, F.: Low cost eye tracking: The current panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016). https://doi.org/10.1155/2016/8680541
Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. 
[2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. 
Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. 
[2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Radach, R., Kennedy, A.: Theoretical perspectives on eye movements in reading: Past controversies, current issues, and an agenda for future research. European journal of cognitive psychology 16(1-2), 3–26 (2004) Winke [2013] Winke, P.M.: Eye-tracking technology for reading. The companion to language assessment 2, 1029–1046 (2013) Hollenstein et al. 
[2018] Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: Zuco, a simultaneous eeg and eye-tracking resource for natural sentence reading. Scientific data 5(1), 1–13 (2018) Cop et al. [2017] Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior research methods 49, 602–615 (2017) Berzak et al. [2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: Celer: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. 
[2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. 
In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . 
https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. 
Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. 
Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Winke, P.M.: Eye-tracking technology for reading. The companion to language assessment 2, 1029–1046 (2013) Hollenstein et al. [2018] Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: Zuco, a simultaneous eeg and eye-tracking resource for natural sentence reading. Scientific data 5(1), 1–13 (2018) Cop et al. [2017] Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior research methods 49, 602–615 (2017) Berzak et al. [2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: Celer: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. 
[2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. 
arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. 
In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2
Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012)
Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010)
Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016). https://doi.org/10.1155/2016/8680541. Publisher: Hindawi Publishing Corporation
Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA '15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627
Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013). https://doi.org/10.1109/TPAMI.2012.101
Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014). https://doi.org/10.1109/TPAMI.2014.2313123
Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: TurkerGaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015)
Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR '17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170
Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN'20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662
Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature Communications 11(1), 4553 (2020)
Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022). https://doi.org/10.1016/j.bspc.2022.103521
Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022)
Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023)
Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421. https://aclanthology.org/2020.acl-main.421
Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264. https://aclanthology.org/D16-1264
Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press, Cambridge (2008)
Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016). https://doi.org/10.3758/s13428-015-0642-8
Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000)
Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological Review 87(4), 329 (1980)
DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408. https://aclanthology.org/2020.acl-main.408
Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021)
Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022)
Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023)
Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022)
Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE
Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423. https://aclanthology.org/N19-1423
Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022)
Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023)
Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
Hollenstein et al. [2018] Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: ZuCo, a simultaneous EEG and eye-tracking resource for natural sentence reading. Scientific Data 5(1), 1–13 (2018)
Cop et al. [2017] Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior Research Methods 49, 602–615 (2017)
Berzak et al. [2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: CELER: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022)
Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior Research Methods, 1–21 (2022). Publisher: Springer
Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016)
Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in English as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023)
Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The Copenhagen Corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182
Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior Research Methods, 1–7 (2023)
Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior Research Methods 55(1), 364–416 (2023)
Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019)
Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer (2023)
Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021)
Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018). https://doi.org/10.3758/s13428-017-0908-4
González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability.
In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . 
Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. 
[2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. 
Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. 
[2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior research methods 49, 602–615 (2017) Berzak et al. [2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: Celer: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. 
[2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. 
arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. 
In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. 
[2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. 
[2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . 
https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. 
[2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: Celer: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. 
[2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. 
In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Human Information Interaction and Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature Communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests.
Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press, Cambridge (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online.
Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022)
In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . 
Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-based personalized summarization of Wikipedia reading session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext (HUMAN '20). Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662
Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature Communications 11(1), 4553 (2020)
Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022). https://doi.org/10.1016/j.bspc.2022.103521
Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera: Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022)
Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023)
Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. 
Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). 
In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022)
Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023)
Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 https://aclanthology.org/2020.acl-main.421
Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 https://aclanthology.org/D16-1264
Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval, vol. 39. Cambridge University Press, Cambridge (2008)
Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8
Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000)
Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological Review 87(4), 329 (1980)
Morger et al.
[2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. 
[2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. 
[2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. 
In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 
1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. 
IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. 
[2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 
71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. 
Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. 
[2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. 
Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. 
Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . 
https://doi.org/10.1145/2702613.2702627
Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013). https://doi.org/10.1109/TPAMI.2012.101
Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014). https://doi.org/10.1109/TPAMI.2014.2313123
Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: TurkerGaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015)
Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam eye tracking for remote studies of web search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction and Retrieval, CHIIR '17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170
Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-based personalized summarization of Wikipedia reading session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext, HUMAN '20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662
Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature Communications 11(1), 4553 (2020)
Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022). https://doi.org/10.1016/j.bspc.2022.103521
Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera: Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022)
Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023)
Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421
Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264
Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval, vol. 39. Cambridge University Press, Cambridge (2008)
Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016). https://doi.org/10.3758/s13428-015-0642-8
Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000)
Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: From eye fixations to comprehension. Psychological Review 87(4), 329 (1980)
DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408
Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021)
Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-examining human annotations for interpretable NLP (2022)
Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023)
Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022)
Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155. IEEE (2021)
Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423
Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022)
Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023)
Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low cost eye tracking: The current panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016). https://doi.org/10.1155/2016/8680541
Papoutsaki [2015] Papoutsaki, A.: Scalable webcam eye tracking by learning from user interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems, CHI EA '15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. 
[2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. 
Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. 
[2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. 
[2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. 
Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. 
[2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. 
Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. 
Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. 
Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). 
https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. 
[2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. 
[2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016). https://doi.org/10.3758/s13428-015-0642-8
Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000)
Just, M.A., Carpenter, P.A.: A theory of reading: From eye fixations to comprehension. Psychological Review 87(4), 329 (1980)
DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408
Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021)
Chiang, C.-H., Lee, H.-Y.: Re-examining human annotations for interpretable NLP (2022)
Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023)
Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022)
Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movement event detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155. IEEE (2021)
Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423
Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022)
Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023)
Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023)
Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421
Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264
Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval, vol. 39. Cambridge University Press, Cambridge (2008)
In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. 
Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
  21. Radach, R., Kennedy, A.: Theoretical perspectives on eye movements in reading: Past controversies, current issues, and an agenda for future research. European journal of cognitive psychology 16(1-2), 3–26 (2004) Winke [2013] Winke, P.M.: Eye-tracking technology for reading. The companion to language assessment 2, 1029–1046 (2013) Hollenstein et al. [2018] Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: Zuco, a simultaneous eeg and eye-tracking resource for natural sentence reading. Scientific data 5(1), 1–13 (2018) Cop et al. [2017] Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior research methods 49, 602–615 (2017) Berzak et al. [2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: Celer: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. 
In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . 
https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. 
CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. 
[2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. 
Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Winke, P.M.: Eye-tracking technology for reading. The companion to language assessment 2, 1029–1046 (2013) Hollenstein et al. [2018] Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: Zuco, a simultaneous eeg and eye-tracking resource for natural sentence reading. Scientific data 5(1), 1–13 (2018) Cop et al. [2017] Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior research methods 49, 602–615 (2017) Berzak et al. [2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: Celer: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. 
[2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. 
arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. 
In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. 
[2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. 
[2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . 
https://aclanthology.org/2020.acl-main.408
Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021)
Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-examining human annotations for interpretable NLP (2022)
Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023)
Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022)
Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155. IEEE (2021)
Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423. https://aclanthology.org/N19-1423
Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022)
Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023)
Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
Hollenstein et al. [2018] Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: ZuCo, a simultaneous EEG and eye-tracking resource for natural sentence reading. Scientific Data 5(1), 1–13 (2018)
Cop et al. [2017] Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior Research Methods 49, 602–615 (2017)
Berzak et al. [2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: CELER: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022)
Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior Research Methods, 1–21 (2022). Springer
Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016)
Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in English as a second language: Evidence from the Multilingual Eye-Movements Corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023)
Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The Copenhagen Corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182
Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior Research Methods, 1–7 (2023)
Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior Research Methods 55(1), 364–416 (2023)
Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019)
Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer (2023)
Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence, pp. 4907–4913 (2021)
Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018). https://doi.org/10.3758/s13428-017-0908-4
González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using gaze to predict text readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050. https://aclanthology.org/W17-5050
Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016. https://aclanthology.org/K16-1016
Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence classification with human attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030. https://aclanthology.org/K18-1030
Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2
Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012)
Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010)
Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low cost eye tracking: The current panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016). https://doi.org/10.1155/2016/8680541
Papoutsaki [2015] Papoutsaki, A.: Scalable webcam eye tracking by learning from user interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627
Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013). https://doi.org/10.1109/TPAMI.2012.101
Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014). https://doi.org/10.1109/TPAMI.2014.2313123
Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: TurkerGaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015)
Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam eye tracking for remote studies of web search. In: Proceedings of the 2017 Conference on Human Information Interaction and Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170
Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-based personalized summarization of Wikipedia reading session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN ’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662
Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature Communications 11(1), 4553 (2020)
Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022). https://doi.org/10.1016/j.bspc.2022.103521
Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera: Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022)
Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023)
Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421. https://aclanthology.org/2020.acl-main.421
Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264. https://aclanthology.org/D16-1264
Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval, vol. 39.
Cambridge University Press, Cambridge (2008)
Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016). https://doi.org/10.3758/s13428-015-0642-8
Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000)
Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological Review 87(4), 329 (1980)
DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408. https://aclanthology.org/2020.acl-main.408
[2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 
4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. 
[2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010)
Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016). https://doi.org/10.1155/2016/8680541
Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
Hollenstein, N., Barrett, M., Björnsdóttir, M.: The Copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182
Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior Research Methods, 1–7 (2023)
Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: Empirical foundations for a minimal reporting guideline. Behavior Research Methods 55(1), 364–416 (2023)
Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019)
Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer (2023)
45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. 
Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). 
https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. 
In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 
4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. 
[2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. 
[2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. 
[2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. 
In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 
1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. 
Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. 
IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. 
In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. 
[2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. 
[2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . 
https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123
In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408
Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021)
Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-examining human annotations for interpretable NLP (2022)
Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023)
Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022)
[2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. 
[2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. 
Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. 
[2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. 
Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. 
Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. 
Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). 
https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. 
[2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. 
[2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. 
In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. 
[2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. 
In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408. https://aclanthology.org/2020.acl-main.408
  22. Winke, P.M.: Eye-tracking technology for reading. The Companion to Language Assessment 2, 1029–1046 (2013)
  23. Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: ZuCo, a simultaneous EEG and eye-tracking resource for natural sentence reading. Scientific Data 5(1), 1–13 (2018)
  24. Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior Research Methods 49, 602–615 (2017)
  25. Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: CELER: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022)
  26. Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior Research Methods, 1–21 (2022)
  27. Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: Activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016)
  28. Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in English as a second language: Evidence from the Multilingual Eye-movements Corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023)
  29. Hollenstein, N., Barrett, M., Björnsdóttir, M.: The Copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182
  30. Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior Research Methods, 1–7 (2023)
  31. Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: Empirical foundations for a minimal reporting guideline. Behavior Research Methods 55(1), 364–416 (2023)
  32. Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019)
  33. Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer (2023)
  34. Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence, pp. 4907–4913 (2021)
  35. Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018). https://doi.org/10.3758/s13428-017-0908-4
  36. González-Garduño, A.V., Søgaard, A.: Using gaze to predict text readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050
  37. Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016
  38. Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence classification with human attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030
  39. Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2
  40. Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012)
  41. Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: How speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010)
  42. Ferhat, O., Vilariño, F.: Low cost eye tracking: The current panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016). https://doi.org/10.1155/2016/8680541
  43. Papoutsaki, A.: Scalable webcam eye tracking by learning from user interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems, CHI EA '15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627
  44. Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013). https://doi.org/10.1109/TPAMI.2012.101
  45. Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014). https://doi.org/10.1109/TPAMI.2014.2313123
  46. Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: TurkerGaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015)
  47. Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam eye tracking for remote studies of web search. In: Proceedings of the 2017 Conference on Human Information Interaction and Retrieval, CHIIR '17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170
  48. Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-based personalized summarization of Wikipedia reading session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext, HUMAN '20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662
  49. Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature Communications 11(1), 4553 (2020)
  50. Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022). https://doi.org/10.1016/j.bspc.2022.103521
  51. Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera: Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022)
  52. Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023)
  53. Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421
  54. Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264
  55. Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval, vol. 39. Cambridge University Press, Cambridge (2008)
  56. Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016). https://doi.org/10.3758/s13428-015-0642-8
  57. Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000)
  58. Just, M.A., Carpenter, P.A.: A theory of reading: From eye fixations to comprehension. Psychological Review 87(4), 329 (1980)
  59. DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408
  60. Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021)
  61. Chiang, C.-H., Lee, H.-Y.: Re-examining human annotations for interpretable NLP (2022)
  62. Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023)
  63. Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022)
  64. Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155. IEEE (2021)
  65. Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423
  66. Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022)
  67. Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023)
  68. Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: Celer: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. 
[2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 
4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. 
[2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. 
In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 
1–7 (2022)
[2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. 
Behavior Research Methods 48(3), 829–842 (2016). https://doi.org/10.3758/s13428-015-0642-8
Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000)
Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological Review 87(4), 329 (1980)
DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408. https://aclanthology.org/2020.acl-main.408
Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. 
[2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. 
Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . 
https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. 
[2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. 
Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. 
Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. 
Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. 
Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). 
https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 
45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. 
Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: How speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010)

Ferhat, O., Vilariño, F.: Low cost eye tracking: The current panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016). https://doi.org/10.1155/2016/8680541

Papoutsaki, A.: Scalable webcam eye tracking by learning from user interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems (CHI EA '15), pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627

Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013). https://doi.org/10.1109/TPAMI.2012.101

Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014). https://doi.org/10.1109/TPAMI.2014.2313123

Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: TurkerGaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015)

Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam eye tracking for remote studies of web search. In: Proceedings of the 2017 Conference on Human Information Interaction and Retrieval (CHIIR '17), pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170

Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-based personalized summarization of Wikipedia reading session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext (HUMAN '20). Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662

Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature Communications 11(1), 4553 (2020)

Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022). https://doi.org/10.1016/j.bspc.2022.103521

Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera: Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022)

Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023)

Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421

Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264

Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval. Cambridge University Press, Cambridge (2008)

Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016). https://doi.org/10.3758/s13428-015-0642-8

Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000)

Just, M.A., Carpenter, P.A.: A theory of reading: From eye fixations to comprehension. Psychological Review 87(4), 329 (1980)

DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408

Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021)

Chiang, C.-H., Lee, H.-Y.: Re-examining human annotations for interpretable NLP (2022)

Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023)

Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022)

Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155. IEEE (2021)

Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423

Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022)

Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023)

Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
[2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. 
In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. 
[2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. 
[2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. 
[2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. 
In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 
1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. 
[2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . 
https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. 
[2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. 
[2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. 
[2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. 
45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 
45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. 
Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). 
IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . 
https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
  23. Hollenstein, N., Rotsztejn, J., Troendle, M., Pedroni, A., Zhang, C., Langer, N.: ZuCo, a simultaneous EEG and eye-tracking resource for natural sentence reading. Scientific Data 5(1), 1–13 (2018) Cop et al. [2017] Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior Research Methods 49, 602–615 (2017) Berzak et al. [2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: CELER: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior Research Methods, 1–21 (2022). Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in English as a second language: Evidence from the Multilingual Eye-movements Corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The Copenhagen Corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al.
[2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior Research Methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior Research Methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis.
In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . 
https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. 
Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. 
Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Rajpurkar et al.
[2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: CELER: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022)
Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior Research Methods, 1–21 (2022). Publisher: Springer
Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. 
[2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. 
In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . 
https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. 
https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . 
https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. 
Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. 
Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. 
IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. 
[2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 
71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. 
Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. 
[2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. 
[2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . 
https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. 
Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023)
Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013). https://doi.org/10.1109/TPAMI.2012.101
Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014). https://doi.org/10.1109/TPAMI.2014.2313123
Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: TurkerGaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015)
Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam eye tracking for remote studies of web search. In: Proceedings of the 2017 Conference on Human Information Interaction and Retrieval (CHIIR '17), pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170
Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-based personalized summarization of Wikipedia reading session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext (HUMAN '20). Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662
Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature Communications 11(1), 4553 (2020)
Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022). https://doi.org/10.1016/j.bspc.2022.103521
Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022)
Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023)
Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421
Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264
Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval. Cambridge University Press, Cambridge (2008)
Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016). https://doi.org/10.3758/s13428-015-0642-8
Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000)
Just, M.A., Carpenter, P.A.: A theory of reading: From eye fixations to comprehension. Psychological Review 87(4), 329 (1980)
DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408
Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021)
Chiang, C.-H., Lee, H.-Y.: Re-examining human annotations for interpretable NLP (2022)
Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023)
Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022)
Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155. IEEE (2021)
Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423
Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022)
https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. 
[2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. 
[2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. 
In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. 
[2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. 
In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 
1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. 
Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. 
[2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. 
  24. Cop, U., Dirix, N., Drieghe, D., Duyck, W.: Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior research methods 49, 602–615 (2017) Berzak et al. [2022] Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: Celer: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. 
[2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. 
[2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. 
[2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. 
[2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 
71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. 
Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: Celer: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. 
[2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. 
In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . 
Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: TurkerGaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Human Information Interaction and Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 Valliappan et al.
[2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature Communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera: Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39.
Cambridge University Press, Cambridge (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological Review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al.
[2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE
In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . 
https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. 
[2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. 
Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. 
Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. 
Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). 
https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 
45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. 
[2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. 
[2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. 
[2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. 
[2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 
71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. 
Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior Research Methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms.
Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. 
Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20.
Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature Communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521
In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press, Cambridge (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408
Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2
Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012)
Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: How speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010)
Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016). https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation
IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. 
[2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 
71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. 
Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. 
[2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. 
[2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . 
https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. 
[2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. 
[2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. 
Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. 
[2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. 
[2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. 
[2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . 
Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021)
Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-examining human annotations for interpretable NLP (2022)
Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023)
Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022)
Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155. IEEE (2021)
Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423
Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022)
Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023)
Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: TurkerGaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015)
Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam eye tracking for remote studies of web search. In: Proceedings of the 2017 Conference on Human Information Interaction and Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170
Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-based personalized summarization of Wikipedia reading session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN ’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662
Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature Communications 11(1), 4553 (2020)
Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022). https://doi.org/10.1016/j.bspc.2022.103521
Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera: Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022)
Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023)
Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421
Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264
Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval, vol. 39. Cambridge University Press, Cambridge (2008)
Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016). https://doi.org/10.3758/s13428-015-0642-8
Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000)
Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: From eye fixations to comprehension. Psychological Review 87(4), 329 (1980)
DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. 
In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 
1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. 
Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. 
[2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. 
Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. 
[2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. 
[2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. 
[2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . 
https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. 
[2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. 
  25. Berzak, Y., Nakamura, C., Smith, A., Weng, E., Katz, B., Flynn, S., Levy, R.: Celer: A 365-participant corpus of eye movements in L1 and L2 English reading. Open Mind 6, 41–50 (2022) Siegelman et al. [2022] Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. 
Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . 
https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. 
[2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. 
[2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. 
In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 
1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior research methods, 1–21 (2022). Publisher: Springer Desai et al. [2016] Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). 
Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. 
[2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. 
[2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. 
[2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 
71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. 
Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016) Kuperman et al. [2023] Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. 
[2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. 
In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: How speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016). https://doi.org/10.1155/2016/8680541 Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA '15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013). https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014). https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: TurkerGaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Human Information Interaction and Retrieval. CHIIR '17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN '20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature Communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests.
Biomedical Signal Processing and Control 74, 103521 (2022). https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera: Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval, vol. 39. Cambridge University Press, Cambridge (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online.
Behavior Research Methods 48(3), 829–842 (2016). https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: From eye fixations to comprehension. Psychological Review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE
https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. 
[2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
[2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021)
Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-examining human annotations for interpretable NLP (2022)
Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023)
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . 
Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. 
[2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. 
Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. 
[2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. 
[2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. 
[2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. 
[2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. 
In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 
1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. 
IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. 
[2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 
71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. 
Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. 
[2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. 
Biomedical Signal Processing and Control 74, 103521 (2022). https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera: Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval. Cambridge University Press, Cambridge (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online.
Behavior Research Methods 48(3), 829–842 (2016). https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: From eye fixations to comprehension. Psychological Review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-examining human annotations for interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155. IEEE (2021) Devlin et al.
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 https://aclanthology.org/N19-1423
[2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . 
https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. 
[2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. 
Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. 
Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. 
[2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. 
Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. 
[2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. 
[2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. 
Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. 
[2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. 
Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. 
Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. 
Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). 
https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. 
[2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. 
[2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. 
In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. 
[2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. 
In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 
1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. 
Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. 
[2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. 
Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. 
  15. Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological Review 87(4), 329 (1980)
  16. DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408
  17. Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021)
  18. Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022)
  19. Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023)
  20. Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022)
  21. Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155. IEEE (2021)
  22. Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423
  23. Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022)
  24. Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023)
  25. Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
  26. Siegelman, N., Schroeder, S., Acartürk, C., Ahn, H.-D., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Expanding horizons of cross-linguistic research on reading: The Multilingual Eye-movement Corpus (MECO). Behavior Research Methods, 1–21 (2022)
  27. Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016)
In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 
1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in english as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023) Hollenstein et al. [2022] Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. 
arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. 
In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. 
[2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. 
[2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . 
https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. 
[2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. 
In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. 
In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . 
https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. 
[2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior Research Methods, 1–7 (2023)
Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior Research Methods 55(1), 364–416 (2023)
Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019)
Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer (2023)
Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence, pp. 4907–4913 (2021)
Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018). https://doi.org/10.3758/s13428-017-0908-4
González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050
Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016
Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030
Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2
Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012)
Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010)
Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016). https://doi.org/10.1155/2016/8680541
Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems (CHI EA '15), pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627
Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013). https://doi.org/10.1109/TPAMI.2012.101
Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014). https://doi.org/10.1109/TPAMI.2014.2313123
Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: TurkerGaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015)
Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Human Information Interaction and Retrieval (CHIIR '17), pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170
Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext (HUMAN '20). Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662
Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature Communications 11(1), 4553 (2020)
Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022). https://doi.org/10.1016/j.bspc.2022.103521
Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera: Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022)
Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023)
Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421
Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264
Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval, vol. 39. Cambridge University Press, Cambridge (2008)
Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016). https://doi.org/10.3758/s13428-015-0642-8
Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000)
Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological Review 87(4), 329 (1980)
DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408
Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021)
Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022)
Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023)
Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022)
Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155. IEEE (2021)
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. 
In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. 
In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . 
https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. 
[2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. 
In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. 
IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. 
In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. 
[2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. 
[2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. 
Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. 
[2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. 
Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera – Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval. Cambridge University Press, Cambridge (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online.
Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological Review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al.
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443.
Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050
Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . 
https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. 
[2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. 
Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. 
Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. 
Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. 
Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). 
https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 
45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. 
[2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. 
[2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext (HUMAN '20). Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662
Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature Communications 11(1), 4553 (2020)
Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022). https://doi.org/10.1016/j.bspc.2022.103521
Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022)
Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023)
Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421
Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264
Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval. Cambridge University Press, Cambridge (2008)
Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016). https://doi.org/10.3758/s13428-015-0642-8
Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000)
Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: From eye fixations to comprehension. Psychological Review 87(4), 329 (1980)
DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408
Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021)
Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022)
Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023)
Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022)
Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155. IEEE (2021)
Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423
Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022)
Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023)
Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: How speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010)
Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016). https://doi.org/10.1155/2016/8680541
Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems (CHI EA '15), pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627
Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013). https://doi.org/10.1109/TPAMI.2012.101
Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014). https://doi.org/10.1109/TPAMI.2014.2313123
Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015)
Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Human Information Interaction and Retrieval (CHIIR '17), pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170
https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. 
[2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. 
Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. 
[2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. 
[2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. 
Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. 
[2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. 
Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. 
[2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. 
[2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. 
In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. 
[2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. 
[2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. 
In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 
45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 
45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. 
Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). 
IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . 
https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
27. Desai, R.H., Choi, W., Lai, V.T., Henderson, J.M.: Toward semantics in the wild: Activation to manipulable nouns in naturalistic reading. Journal of Neuroscience 36(14), 4050–4055 (2016)
28. Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in English as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023)
29. Hollenstein, N., Barrett, M., Björnsdóttir, M.: The Copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182
30. Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior Research Methods, 1–7 (2023)
31. Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: Empirical foundations for a minimal reporting guideline. Behavior Research Methods 55(1), 364–416 (2023)
32. Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019)
33. Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer (2023)
34. Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence, pp. 4907–4913 (2021)
35. Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018). https://doi.org/10.3758/s13428-017-0908-4
36. González-Garduño, A.V., Søgaard, A.: Using gaze to predict text readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050
37. Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016
38. Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence classification with human attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030
39. Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2
40. Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012)
41. Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: How speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010)
42. Ferhat, O., Vilariño, F.: Low cost eye tracking: The current panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016). https://doi.org/10.1155/2016/8680541
43. Papoutsaki, A.: Scalable webcam eye tracking by learning from user interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems (CHI EA '15), pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627
44. Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013). https://doi.org/10.1109/TPAMI.2012.101
45. Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014). https://doi.org/10.1109/TPAMI.2014.2313123
46. Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: TurkerGaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015)
47. Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam eye tracking for remote studies of web search. In: Proceedings of the 2017 Conference on Human Information Interaction and Retrieval (CHIIR '17), pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170
48. Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-based personalized summarization of Wikipedia reading session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext (HUMAN '20). Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662
49. Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature Communications 11(1), 4553 (2020)
50. Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022). https://doi.org/10.1016/j.bspc.2022.103521
51. Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera: Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022)
52. Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023)
53. Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421
54. Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264
55. Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval, vol. 39. Cambridge University Press, Cambridge (2008)
56. Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016). https://doi.org/10.3758/s13428-015-0642-8
57. Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000)
58. Just, M.A., Carpenter, P.A.: A theory of reading: From eye fixations to comprehension. Psychological Review 87(4), 329 (1980)
59. DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408
60. Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021)
61. Chiang, C.-H., Lee, H.-Y.: Re-examining human annotations for interpretable NLP (2022)
62. Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023)
63. Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022)
64. Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155. IEEE (2021)
65. Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423
66. Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022)
67. Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023)
68. Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Hollenstein, N., Barrett, M., Björnsdóttir, M.: The copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182 Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior research methods, 1–7 (2023) Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior research methods 55(1), 364–416 (2023) Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019) Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. 
Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. 
Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). 
https://aclanthology.org/2022.clasp-1.2
Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012)
Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: How speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010)
[2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. 
1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. 
IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. 
[2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 
71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. 
Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. 
[2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. 
Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. 
Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . 
https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. 
Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. 
Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. 
IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013). https://doi.org/10.1109/TPAMI.2012.101
Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014). https://doi.org/10.1109/TPAMI.2014.2313123
Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015)
Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Human Information Interaction and Retrieval (CHIIR '17), pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170
Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext (HUMAN '20). Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662
Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature Communications 11(1), 4553 (2020)
Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022). https://doi.org/10.1016/j.bspc.2022.103521
Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera: Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022)
Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023)
Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421
Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264
Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval, vol. 39. Cambridge University Press, Cambridge (2008)
Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016). https://doi.org/10.3758/s13428-015-0642-8
Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000)
Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: From eye fixations to comprehension. Psychological Review 87(4), 329 (1980)
DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408
Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021)
Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022)
Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023)
Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022)
Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155. IEEE (2021)
Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423
Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022)
Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023)
Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems (CHI EA '15), pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627
[2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. 
Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. 
[2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. 
[2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. 
[2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . 
https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. 
[2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. 
[2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. 
[2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. 
In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. 
Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. 
In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. 
Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
Kuperman, V., Siegelman, N., Schroeder, S., Acartürk, C., Alexeeva, S., Amenta, S., Bertram, R., Bonandrini, R., Brysbaert, M., Chernova, D., et al.: Text reading in English as a second language: Evidence from the multilingual eye-movements corpus. Studies in Second Language Acquisition 45(1), 3–37 (2023)
Hollenstein, N., Barrett, M., Björnsdóttir, M.: The Copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182
Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior Research Methods, 1–7 (2023)
Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: Empirical foundations for a minimal reporting guideline. Behavior Research Methods 55(1), 364–416 (2023)
Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019)
Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer (2023)
Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence, pp. 4907–4913 (2021)
Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018). https://doi.org/10.3758/s13428-017-0908-4
González-Garduño, A.V., Søgaard, A.: Using gaze to predict text readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 https://aclanthology.org/W17-5050
Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 https://aclanthology.org/K16-1016
Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence classification with human attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 https://aclanthology.org/K18-1030
Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2
Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012)
Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: How speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010)
Ferhat, O., Vilariño, F.: Low cost eye tracking: The current panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016). https://doi.org/10.1155/2016/8680541
Papoutsaki, A.: Scalable webcam eye tracking by learning from user interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems (CHI EA '15), pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627
Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013). https://doi.org/10.1109/TPAMI.2012.101
Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014). https://doi.org/10.1109/TPAMI.2014.2313123
Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: TurkerGaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015)
Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam eye tracking for remote studies of web search. In: Proceedings of the 2017 Conference on Human Information Interaction and Retrieval (CHIIR '17), pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170
Dubey et al.
[2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. 
[2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). 
https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 
[2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. 
[2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . 
[2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. 
In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 
1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. 
Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. 
[2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. 
Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. 
[2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. 
[2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. 
[2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . 
https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. 
[2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. 
[2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. 
[2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. 
In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. 
Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. 
In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. 
Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
  29. Hollenstein, N., Barrett, M., Björnsdóttir, M.: The Copenhagen corpus of eye tracking recordings from natural reading of Danish texts. In: Proceedings of the Thirteenth Language Resources and Evaluation Conference, pp. 1712–1720. European Language Resources Association, Marseille, France (2022). https://aclanthology.org/2022.lrec-1.182
  30. Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior Research Methods, 1–7 (2023)
  31. Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: Empirical foundations for a minimal reporting guideline. Behavior Research Methods 55(1), 364–416 (2023)
  32. Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019)
  33. Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer (2023)
  34. Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence, pp. 4907–4913 (2021)
  35. Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018). https://doi.org/10.3758/s13428-017-0908-4
  36. González-Garduño, A.V., Søgaard, A.: Using gaze to predict text readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050
  37. Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016
  38. Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence classification with human attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030
  39. Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2
  40. Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012)
  41. Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: How speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010)
  42. Ferhat, O., Vilariño, F.: Low cost eye tracking: The current panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016). https://doi.org/10.1155/2016/8680541
  43. Papoutsaki, A.: Scalable webcam eye tracking by learning from user interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems (CHI EA '15), pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627
  44. Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013). https://doi.org/10.1109/TPAMI.2012.101
  45. Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014). https://doi.org/10.1109/TPAMI.2014.2313123
  46. Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: TurkerGaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015)
  47. Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam eye tracking for remote studies of web search. In: Proceedings of the 2017 Conference on Human Information Interaction and Retrieval (CHIIR '17), pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170
  48. Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-based personalized summarization of Wikipedia reading session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext (HUMAN '20). Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662
  49. Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature Communications 11(1), 4553 (2020)
  50. Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022). https://doi.org/10.1016/j.bspc.2022.103521
  51. Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera: Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022)
  52. Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023)
  53. Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421
  54. Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264
  55. Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval, vol. 39. Cambridge University Press, Cambridge (2008)
  56. Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016). https://doi.org/10.3758/s13428-015-0642-8
  57. Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000)
  58. Just, M.A., Carpenter, P.A.: A theory of reading: From eye fixations to comprehension. Psychological Review 87(4), 329 (1980)
  59. DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408
  60. Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021)
  61. Chiang, C.-H., Lee, H.-Y.: Re-examining human annotations for interpretable NLP (2022)
  62. Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023)
  63. Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022)
  64. Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155. IEEE (2021)
  65. Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423
  66. Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022)
  67. Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023)
  68. Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
Springer, ??? (2023) Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). 
https://aclanthology.org/2022.clasp-1.2
Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012)
Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: How speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010)
[2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. 
1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. 
IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. 
[2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 
71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. 
Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. 
[2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. 
Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. 
Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . 
https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. 
Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. 
Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. 
IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101
Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123
Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: TurkerGaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015)
Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam eye tracking for remote studies of web search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction and Retrieval, CHIIR '17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170
Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-based personalized summarization of Wikipedia reading session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext, HUMAN '20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662
Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature Communications 11(1), 4553 (2020)
Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521
Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera: Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022)
Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023)
Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 https://aclanthology.org/2020.acl-main.421
Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 https://aclanthology.org/D16-1264
Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval, vol. 39. Cambridge University Press, Cambridge (2008)
Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8
Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000)
Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: From eye fixations to comprehension. Psychological Review 87(4), 329 (1980)
DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 https://aclanthology.org/2020.acl-main.408
Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021)
Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-examining human annotations for interpretable NLP (2022)
Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023)
Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022)
Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE
Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 https://aclanthology.org/N19-1423
Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022)
Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023)
Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
Papoutsaki [2015] Papoutsaki, A.: Scalable webcam eye tracking by learning from user interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems, CHI EA '15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627
Ikhwantri et al.
[2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. 
Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. 
[2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. 
[2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. 
[2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . 
https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. 
[2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. 
[2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. 
[2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. 
In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. 
Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. 
In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. 
Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
Dunn et al. [2023] Dunn, M.J., Alexander, R.G., Amiebenomo, O.M., Arblaster, G., Atan, D., Erichsen, J.T., Ettinger, U., Giardini, M.E., Gilchrist, I.D., Hamilton, R., et al.: Minimal reporting guideline for research involving eye tracking (2023 edition). Behavior Research Methods, 1–7 (2023)
Holmqvist et al. [2023] Holmqvist, K., Örbom, S.L., Hooge, I.T., Niehorster, D.C., Alexander, R.G., Andersson, R., Benjamins, J.S., Blignaut, P., Brouwer, A.-M., Chuang, L.L., et al.: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior Research Methods 55(1), 364–416 (2023)
Hollenstein et al. [2019] Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019)
Beinborn and Hollenstein [2023] Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer (2023)
Mathias et al. [2021] Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence, pp. 4907–4913 (2021)
Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4
González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 https://aclanthology.org/W17-5050
Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 https://aclanthology.org/K16-1016
Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 https://aclanthology.org/K18-1030
Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2
Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012)
Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010)
Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541. Publisher: Hindawi Publishing Corporation
Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627
Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101
Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123
Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: TurkerGaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015)
Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Human Information Interaction and Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170
Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662
Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature Communications 11(1), 4553 (2020)
Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521
Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera: Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022)
Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023)
Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 https://aclanthology.org/2020.acl-main.421
Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 https://aclanthology.org/D16-1264
Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press, Cambridge (2008)
Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8
Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000)
Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological Review 87(4), 329 (1980)
DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 https://aclanthology.org/2020.acl-main.408
Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021)
Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022)
Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023)
Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022)
Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE
Devlin et al.
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 https://aclanthology.org/N19-1423
Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022)
Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023)
Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Conference on International Joint Conferences on Artificial Intelligence, pp. 4907–4913 (2021) Luke and Christianson [2018] Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. 
[2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. 
Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . 
https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. 
[2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. 
[2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . 
https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. 
[2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. 
[2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. 
[2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. 
[2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023)
Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421
Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264
Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval, vol. 39. Cambridge University Press, Cambridge (2008)
Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016). https://doi.org/10.3758/s13428-015-0642-8
Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000)
Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: From eye fixations to comprehension. Psychological Review 87(4), 329 (1980)
DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408
Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021)
Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-examining human annotations for interpretable NLP (2022)
Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023)
Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022)
Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movement event detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155. IEEE (2021)
Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423
Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022)
Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023)
Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014). https://doi.org/10.1109/TPAMI.2014.2313123
Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: TurkerGaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015)
Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam eye tracking for remote studies of web search. In: Proceedings of the 2017 Conference on Human Information Interaction and Retrieval (CHIIR '17), pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170
Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-based personalized summarization of Wikipedia reading session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext (HUMAN '20). Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662
Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking.
Nature Communications 11(1), 4553 (2020)
Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022). https://doi.org/10.1016/j.bspc.2022.103521
Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera: Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022)
[2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . 
https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. 
[2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. 
Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. 
[2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. 
[2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. 
Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. 
[2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. 
Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. 
[2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. 
[2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. 
In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. 
[2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. 
[2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. 
In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 
45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 
45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. 
Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). 
IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . 
https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. 
Journal of Eye Movement Research 3(3) (2010)

Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low cost eye tracking: The current panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016). https://doi.org/10.1155/2016/8680541
[2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. 
Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. 
[2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. 
[2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. 
[2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . 
https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. 
[2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. 
Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. 
Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. 
[2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. 
Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. 
[2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. 
[2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. 
Cambridge University Press, Cambridge (2008)
Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016). https://doi.org/10.3758/s13428-015-0642-8
Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000)
Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: From eye fixations to comprehension. Psychological Review 87(4), 329 (1980)
DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408. https://aclanthology.org/2020.acl-main.408
Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021)
Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022)
Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023)
Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022)
Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155. IEEE (2021)
Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423. https://aclanthology.org/N19-1423
Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022)
Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023)
Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature Communications 11(1), 4553 (2020)
Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022). https://doi.org/10.1016/j.bspc.2022.103521
Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera – Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022)
Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023)
Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421. https://aclanthology.org/2020.acl-main.421
Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264. https://aclanthology.org/D16-1264
[2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. 
[2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . 
https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. 
[2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. 
[2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. 
[2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. 
In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. 
Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. 
In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. 
Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
  32. Hollenstein, N., Barrett, M., Troendle, M., Bigiolli, F., Langer, N., Zhang, C.: Advancing NLP with cognitive language processing signals. arXiv preprint arXiv:1904.02682 (2019)
  33. Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer (2023)
  34. Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence, pp. 4907–4913 (2021)
  35. Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018). https://doi.org/10.3758/s13428-017-0908-4
  36. González-Garduño, A.V., Søgaard, A.: Using gaze to predict text readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050
  37. Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016
  38. Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence classification with human attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030
  39. Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2
  40. Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012)
  41. Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: How speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010)
  42. Ferhat, O., Vilariño, F.: Low cost eye tracking: The current panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016). https://doi.org/10.1155/2016/8680541
  43. Papoutsaki, A.: Scalable webcam eye tracking by learning from user interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems (CHI EA '15), pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627
  44. Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013). https://doi.org/10.1109/TPAMI.2012.101
  45. Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014). https://doi.org/10.1109/TPAMI.2014.2313123
  46. Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: TurkerGaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015)
  47. Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam eye tracking for remote studies of web search. In: Proceedings of the 2017 Conference on Human Information Interaction and Retrieval (CHIIR '17), pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170
  48. Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-based personalized summarization of Wikipedia reading session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext (HUMAN '20). Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662
  49. Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature Communications 11(1), 4553 (2020)
  50. Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022). https://doi.org/10.1016/j.bspc.2022.103521
  51. Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera: Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022)
  52. Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023)
  53. Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421
  54. Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264
  55. Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval. Cambridge University Press, Cambridge (2008)
  56. Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016). https://doi.org/10.3758/s13428-015-0642-8
  57. Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000)
  58. Just, M.A., Carpenter, P.A.: A theory of reading: From eye fixations to comprehension. Psychological Review 87(4), 329 (1980)
  59. DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408
  60. Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021)
  61. Chiang, C.-H., Lee, H.-Y.: Re-examining human annotations for interpretable NLP (2022)
  62. Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023)
  63. Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022)
  64. Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155. IEEE (2021)
  65. Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423
  66. Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022)
  67. Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023)
  68. Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
[2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 
71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. 
Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. 
[2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. 
Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . 
https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. 
[2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. 
Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. 
Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. 
Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. 
Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). 
https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 
45 (2023)
Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022)
Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155. IEEE (2021)
Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423
Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022)
Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023)
Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012)
Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: How speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010)
Ferhat, O., Vilariño, F.: Low cost eye tracking: The current panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016). https://doi.org/10.1155/2016/8680541
Papoutsaki, A.: Scalable webcam eye tracking by learning from user interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems (CHI EA '15), pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627
Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013). https://doi.org/10.1109/TPAMI.2012.101
Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014). https://doi.org/10.1109/TPAMI.2014.2313123
Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015)
Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam eye tracking for remote studies of web search. In: Proceedings of the 2017 Conference on Human Information Interaction and Retrieval (CHIIR '17), pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170
Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-based personalized summarization of Wikipedia reading session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext (HUMAN '20). Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662
Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature Communications 11(1), 4553 (2020)
Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022). https://doi.org/10.1016/j.bspc.2022.103521
Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera: Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022)
Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023)
Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421
Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264
Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval. Cambridge University Press, Cambridge (2008)
Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016). https://doi.org/10.3758/s13428-015-0642-8
Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000)
Just, M.A., Carpenter, P.A.: A theory of reading: From eye fixations to comprehension. Psychological Review 87(4), 329 (1980)
DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408
Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021)
Chiang, C.-H., Lee, H.-Y.: Re-examining human annotations for interpretable NLP (2022)
Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023)
In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. 
[2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . 
https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. 
[2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. 
Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. 
[2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. 
[2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. 
Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. 
[2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. 
Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. 
[2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. 
  33. Beinborn, L., Hollenstein, N.: Cognitive Plausibility in Natural Language Processing. Springer (2023)
  34. Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence, pp. 4907–4913 (2021)
  35. Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018). https://doi.org/10.3758/s13428-017-0908-4
  36. González-Garduño, A.V., Søgaard, A.: Using gaze to predict text readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050
  37. Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016
  38. Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence classification with human attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030
  39. Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2
  40. Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012)
  41. Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: How speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010)
  42. Ferhat, O., Vilariño, F.: Low cost eye tracking: The current panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016). https://doi.org/10.1155/2016/8680541
  43. Papoutsaki, A.: Scalable webcam eye tracking by learning from user interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems (CHI EA '15), pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627
  44. Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013). https://doi.org/10.1109/TPAMI.2012.101
  45. Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014). https://doi.org/10.1109/TPAMI.2014.2313123
  46. Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: TurkerGaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015)
  47. Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam eye tracking for remote studies of web search. In: Proceedings of the 2017 Conference on Human Information Interaction and Retrieval (CHIIR '17), pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170
  48. Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-based personalized summarization of Wikipedia reading session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext (HUMAN '20). Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662
  49. Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature Communications 11(1), 4553 (2020)
  50. Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022). https://doi.org/10.1016/j.bspc.2022.103521
  51. Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera: Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022)
  52. Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023)
  53. Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421
  54. Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264
  55. Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval. Cambridge University Press, Cambridge (2008)
  56. Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016). https://doi.org/10.3758/s13428-015-0642-8
  57. Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000)
  58. Just, M.A., Carpenter, P.A.: A theory of reading: From eye fixations to comprehension. Psychological Review 87(4), 329 (1980)
  59. DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408
  60. Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021)
  61. Chiang, C.-H., Lee, H.-Y.: Re-examining human annotations for interpretable NLP (2022)
  62. Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023)
  63. Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022)
  64. Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155. IEEE (2021)
  65. Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423
  66. Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022)
  67. Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023)
  68. Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
[2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018) https://doi.org/10.3758/s13428-017-0908-4 González-Garduño and Søgaard [2017] González-Garduño, A.V., Søgaard, A.: Using Gaze to Predict Text Readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050 . https://aclanthology.org/W17-5050 Mishra et al. [2016] Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016 . https://aclanthology.org/K16-1016 Barrett et al. [2018] Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. 
[2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. 
In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 
1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. 
IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. 
[2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 
71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. 
Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. 
[2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. 
Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. 
Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . 
https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. 
Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. 
Behavior Research Methods 48(3), 829–842 (2016). https://doi.org/10.3758/s13428-015-0642-8
Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000)
Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: From eye fixations to comprehension. Psychological Review 87(4), 329 (1980)
DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408
Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021)
Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022)
Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023)
Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022)
Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155. IEEE (2021)
Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423
Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022)
Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023)
Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016). https://doi.org/10.1155/2016/8680541
Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems (CHI EA ’15), pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627
Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013). https://doi.org/10.1109/TPAMI.2012.101
Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014). https://doi.org/10.1109/TPAMI.2014.2313123
Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: TurkerGaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015)
Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction and Retrieval (CHIIR ’17), pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170
Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext (HUMAN ’20). Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662
Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature Communications 11(1), 4553 (2020)
Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022). https://doi.org/10.1016/j.bspc.2022.103521
Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera – Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022)
Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023)
Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421
Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264
Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval, vol. 39. Cambridge University Press, Cambridge (2008)
Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016). https://doi.org/10.3758/s13428-015-0642-8
Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. 
[2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. 
[2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. 
Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. 
[2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. 
Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. 
Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. 
Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). 
https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. 
[2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. 
[2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. 
In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. 
Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421
Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264
Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press, Cambridge (2008)
Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016). https://doi.org/10.3758/s13428-015-0642-8
Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000)
Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological Review 87(4), 329 (1980)
DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408
Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021)
Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022)
Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023)
Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022)
Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155. IEEE (2021)
Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423
Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022)
Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023)
Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
34. Mathias, S., Kanojia, D., Mishra, A., Bhattacharyya, P.: A survey on using gaze behaviour for natural language processing. In: Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence, pp. 4907–4913 (2021)
35. Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018). https://doi.org/10.3758/s13428-017-0908-4
36. González-Garduño, A.V., Søgaard, A.: Using gaze to predict text readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050
37. Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016
38. Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence classification with human attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030
39. Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2
40. Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012)
41. Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: How speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010)
42. Ferhat, O., Vilariño, F.: Low cost eye tracking: The current panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016). https://doi.org/10.1155/2016/8680541
43. Papoutsaki, A.: Scalable webcam eye tracking by learning from user interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems (CHI EA '15), pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627
44. Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013). https://doi.org/10.1109/TPAMI.2012.101
45. Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014). https://doi.org/10.1109/TPAMI.2014.2313123
46. Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: TurkerGaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015)
47. Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam eye tracking for remote studies of web search. In: Proceedings of the 2017 Conference on Human Information Interaction and Retrieval (CHIIR '17), pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170
48. Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-based personalized summarization of Wikipedia reading session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext (HUMAN '20). Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662
49. Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature Communications 11(1), 4553 (2020)
50. Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022). https://doi.org/10.1016/j.bspc.2022.103521
51. Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera: Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022)
52. Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023)
53. Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421
54. Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264
55. Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval, vol. 39. Cambridge University Press, Cambridge (2008)
56. Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016). https://doi.org/10.3758/s13428-015-0642-8
57. Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000)
58. Just, M.A., Carpenter, P.A.: A theory of reading: From eye fixations to comprehension. Psychological Review 87(4), 329 (1980)
59. DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408
60. Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-Fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021)
61. Chiang, C.-H., Lee, H.-Y.: Re-examining human annotations for interpretable NLP (2022)
62. Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023)
63. Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022)
64. Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155. IEEE (2021)
65. Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423
66. Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022)
67. Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023)
68. Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
Cambridge University Press, Cambridge (2008)
Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016). https://doi.org/10.3758/s13428-015-0642-8
Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000)
Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological Review 87(4), 329 (1980)
DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 https://aclanthology.org/2020.acl-main.408
Behavior Research Methods 48(3), 829–842 (2016). https://doi.org/10.3758/s13428-015-0642-8
Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. 
Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. 
[2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. 
Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. 
[2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. 
[2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. 
Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. 
[2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. 
Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. 
Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. 
Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). 
https://doi.org/10.18653/v1/2020.acl-main.408 https://aclanthology.org/2020.acl-main.408

Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021)

Chiang, C.-H., Lee, H.-Y.: Re-examining human annotations for interpretable NLP (2022)

Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023)

Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022)

Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155. IEEE (2021)

Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 https://aclanthology.org/N19-1423

Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022)

Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023)

Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)

Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera — focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022)

Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023)

Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 https://aclanthology.org/2020.acl-main.421

Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 https://aclanthology.org/D16-1264

Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval, vol. 39. Cambridge University Press, Cambridge (2008)

Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016). https://doi.org/10.3758/s13428-015-0642-8

Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000)

Just, M.A., Carpenter, P.A.: A theory of reading: From eye fixations to comprehension. Psychological Review 87(4), 329 (1980)
In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. 
Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. 
In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. 
Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
  35. Luke, S.G., Christianson, K.: The Provo Corpus: A large eye-tracking corpus with predictability norms. Behavior Research Methods 50(2), 826–833 (2018). https://doi.org/10.3758/s13428-017-0908-4
  36. González-Garduño, A.V., Søgaard, A.: Using gaze to predict text readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050
  37. Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016
  38. Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence classification with human attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030
  39. Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2
  40. Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012)
  41. Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: How speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010)
  42. Ferhat, O., Vilariño, F.: Low cost eye tracking: The current panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016). https://doi.org/10.1155/2016/8680541
  43. Papoutsaki, A.: Scalable webcam eye tracking by learning from user interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems (CHI EA '15), pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627
  44. Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013). https://doi.org/10.1109/TPAMI.2012.101
  45. Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014). https://doi.org/10.1109/TPAMI.2014.2313123
  46. Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: TurkerGaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015)
  47. Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam eye tracking for remote studies of web search. In: Proceedings of the 2017 Conference on Human Information Interaction and Retrieval (CHIIR '17), pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170
  48. Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-based personalized summarization of Wikipedia reading session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext (HUMAN '20). Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662
  49. Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature Communications 11(1), 4553 (2020)
  50. Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022). https://doi.org/10.1016/j.bspc.2022.103521
  51. Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera – focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022)
  52. Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023)
  53. Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421
  54. Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264
  55. Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval, vol. 39. Cambridge University Press, Cambridge (2008)
  56. Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016). https://doi.org/10.3758/s13428-015-0642-8
  57. Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000)
  58. Just, M.A., Carpenter, P.A.: A theory of reading: From eye fixations to comprehension. Psychological Review 87(4), 329 (1980)
  59. DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408
  60. Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021)
  61. Chiang, C.-H., Lee, H.-Y.: Re-examining human annotations for interpretable NLP (2022)
  62. Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023)
  63. Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022)
  64. Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155. IEEE (2021)
  65. Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423
  66. Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022)
  67. Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023)
  68. Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
[2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. 
In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 
1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. 
Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. 
[2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. 
Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. 
[2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. 
[2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. 
[2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . 
https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. 
[2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. 
[2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. 
[2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. 
In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. 
Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. 
In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. 
Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
  36. González-Garduño, A.V., Søgaard, A.: Using gaze to predict text readability. In: Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pp. 438–443. Association for Computational Linguistics, Copenhagen, Denmark (2017). https://doi.org/10.18653/v1/W17-5050
  37. Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016
  38. Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence classification with human attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030
  39. Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2
  40. Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012)
  41. Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: How speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010)
  42. Ferhat, O., Vilariño, F.: Low cost eye tracking: The current panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016). https://doi.org/10.1155/2016/8680541
  43. Papoutsaki, A.: Scalable webcam eye tracking by learning from user interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems (CHI EA '15), pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627
  44. Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013). https://doi.org/10.1109/TPAMI.2012.101
  45. Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014). https://doi.org/10.1109/TPAMI.2014.2313123
  46. Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: TurkerGaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015)
  47. Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam eye tracking for remote studies of web search. In: Proceedings of the 2017 Conference on Human Information Interaction and Retrieval (CHIIR '17), pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170
  48. Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-based personalized summarization of Wikipedia reading session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext (HUMAN '20). Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662
  49. Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature Communications 11(1), 4553 (2020)
  50. Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022). https://doi.org/10.1016/j.bspc.2022.103521
  51. Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera: Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022)
  52. Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023)
  53. Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421
  54. Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264
  55. Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval, vol. 39. Cambridge University Press, Cambridge (2008)
  56. Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016). https://doi.org/10.3758/s13428-015-0642-8
  57. Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000)
  58. Just, M.A., Carpenter, P.A.: A theory of reading: From eye fixations to comprehension. Psychological Review 87(4), 329 (1980)
  59. DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408
  60. Wiegreffe, S., Marasović, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021)
  61. Chiang, C.-H., Lee, H.-Y.: Re-examining human annotations for interpretable NLP (2022)
  62. Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023)
  63. Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022)
  64. Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155. IEEE (2021)
  65. Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423
  66. Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022)
  67. Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023)
  68. Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 
https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. 
[2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. 
[2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. 
[2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. 
In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. 
[2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . 
https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. 
[2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. 
Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. 
[2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. 
[2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. 
Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. 
37. Mishra, A., Kanojia, D., Nagar, S., Dey, K., Bhattacharyya, P.: Leveraging cognitive features for sentiment analysis. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, pp. 156–166. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/K16-1016. https://aclanthology.org/K16-1016
Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence classification with human attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030. https://aclanthology.org/K18-1030
Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2
Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012)
Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: How speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010)
Ferhat, O., Vilariño, F.: Low cost eye tracking: The current panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016). https://doi.org/10.1155/2016/8680541
Papoutsaki, A.: Scalable webcam eye tracking by learning from user interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems (CHI EA '15), pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627
Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013). https://doi.org/10.1109/TPAMI.2012.101
Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014). https://doi.org/10.1109/TPAMI.2014.2313123
Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: TurkerGaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015)
Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam eye tracking for remote studies of web search. In: Proceedings of the 2017 Conference on Human Information Interaction and Retrieval (CHIIR '17), pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170
Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-based personalized summarization of Wikipedia reading session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext (HUMAN '20). Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662
Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature Communications 11(1), 4553 (2020)
Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022). https://doi.org/10.1016/j.bspc.2022.103521
Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera: Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022)
Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023)
Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421. https://aclanthology.org/2020.acl-main.421
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence Classification with Human Attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030 . https://aclanthology.org/K18-1030 Morger et al. [2022] Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. 
[2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. 
Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . 
https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2 Holmqvist et al. [2012] Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. 
IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. 
In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. 
[2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. 
[2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. 
[2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. 
[2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 
71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. 
Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. 
[2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. 
[2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera — Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press, Cambridge (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp.
71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. 
Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 DeYoung et al.
In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. 
Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 
45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. 
Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . 
https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. 
[2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. 
In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
  38. Barrett, M., Bingel, J., Hollenstein, N., Rei, M., Søgaard, A.: Sequence classification with human attention. In: Proceedings of the 22nd Conference on Computational Natural Language Learning, pp. 302–312. Association for Computational Linguistics, Brussels, Belgium (2018). https://doi.org/10.18653/v1/K18-1030
  39. Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2
  40. Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012)
  41. Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: How speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010)
  42. Ferhat, O., Vilariño, F.: Low cost eye tracking: The current panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016). https://doi.org/10.1155/2016/8680541
  43. Papoutsaki, A.: Scalable webcam eye tracking by learning from user interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems (CHI EA '15), pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627
  44. Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013). https://doi.org/10.1109/TPAMI.2012.101
  45. Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014). https://doi.org/10.1109/TPAMI.2014.2313123
  46. Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: TurkerGaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015)
  47. Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam eye tracking for remote studies of web search. In: Proceedings of the 2017 Conference on Human Information Interaction and Retrieval (CHIIR '17), pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170
  48. Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-based personalized summarization of Wikipedia reading session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext (HUMAN '20). Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662
  49. Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature Communications 11(1), 4553 (2020)
  50. Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022). https://doi.org/10.1016/j.bspc.2022.103521
  51. Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera: Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022)
  52. Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023)
  53. Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421
  54. Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264
  55. Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval, vol. 39. Cambridge University Press, Cambridge (2008)
  56. Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016). https://doi.org/10.3758/s13428-015-0642-8
  57. Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000)
  58. Just, M.A., Carpenter, P.A.: A theory of reading: From eye fixations to comprehension. Psychological Review 87(4), 329 (1980)
  59. DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408
  60. Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021)
  61. Chiang, C.-H., Lee, H.-Y.: Re-examining human annotations for interpretable NLP (2022)
  62. Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023)
  63. Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022)
  64. Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155. IEEE (2021)
  65. Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423
  66. Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022)
  67. Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023)
  68. Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
Cambridge University Press, Cambridge (2008)
Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. 
Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. 
[2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. 
Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. 
[2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. 
[2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. 
Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. 
[2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. 
Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. 
Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. 
Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). 
https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. 
[2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. 
[2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. 
In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. 
[2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. 
In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 
1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. 
Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. 
[2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. 
Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. 
[2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. 
[2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. 
[2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . 
https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. 
[2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. 
Morger, F., Brandl, S., Beinborn, L., Hollenstein, N.: A cross-lingual comparison of human and model relative word importance. In: Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pp. 11–23. Association for Computational Linguistics, Gothenburg, Sweden (2022). https://aclanthology.org/2022.clasp-1.2
Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012)
Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: How speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010)
Ferhat, O., Vilariño, F.: Low cost eye tracking: The current panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016). https://doi.org/10.1155/2016/8680541
Papoutsaki, A.: Scalable webcam eye tracking by learning from user interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems (CHI EA '15), pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627
Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013). https://doi.org/10.1109/TPAMI.2012.101
Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014). https://doi.org/10.1109/TPAMI.2014.2313123
Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: TurkerGaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015)
Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam eye tracking for remote studies of web search. In: Proceedings of the 2017 Conference on Human Information Interaction and Retrieval (CHIIR '17), pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170
Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-based personalized summarization of Wikipedia reading session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext (HUMAN '20). Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662
Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature Communications 11(1), 4553 (2020)
Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022). https://doi.org/10.1016/j.bspc.2022.103521
Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera: Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022)
Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023)
Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421
Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264
Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval, vol. 39. Cambridge University Press, Cambridge (2008)
Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016). https://doi.org/10.3758/s13428-015-0642-8
Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000)
Just, M.A., Carpenter, P.A.: A theory of reading: From eye fixations to comprehension. Psychological Review 87(4), 329 (1980)
In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 
1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012) Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. 
IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. 
In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. 
[2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. 
[2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010) Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. 
IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. 
In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. 
[2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. 
[2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. 
arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 
45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. 
Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . 
https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. 
[2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. 
In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. 
[2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. 
  40. Holmqvist, K., Nyström, M., Mulvey, F.: Eye tracker data quality: What it is and how to measure it. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 45–52 (2012)
  41. Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: How speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010)
  42. Ferhat, O., Vilariño, F.: Low cost eye tracking: The current panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016). https://doi.org/10.1155/2016/8680541
  43. Papoutsaki, A.: Scalable webcam eye tracking by learning from user interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems (CHI EA '15), pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627
  44. Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013). https://doi.org/10.1109/TPAMI.2012.101
  45. Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014). https://doi.org/10.1109/TPAMI.2014.2313123
  46. Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: TurkerGaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015)
  47. Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam eye tracking for remote studies of web search. In: Proceedings of the 2017 Conference on Human Information Interaction and Retrieval (CHIIR '17), pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170
  48. Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-based personalized summarization of Wikipedia reading session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext (HUMAN '20). Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662
  49. Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature Communications 11(1), 4553 (2020)
  50. Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022). https://doi.org/10.1016/j.bspc.2022.103521
  51. Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022)
  52. Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023)
  53. Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421
  54. Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264
  55. Manning, C.D., Raghavan, P., Schütze, H.: Introduction to Information Retrieval. Cambridge University Press, Cambridge (2008)
  56. Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016). https://doi.org/10.3758/s13428-015-0642-8
  57. Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000)
  58. Just, M.A., Carpenter, P.A.: A theory of reading: From eye fixations to comprehension. Psychological Review 87(4), 329 (1980)
  59. DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408
  60. Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021)
  61. Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022)
  62. Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023)
  63. Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022)
  64. Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155. IEEE (2021)
  65. Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423
  66. Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022)
  67. Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023)
  68. Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
[2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. 
[2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. 
In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. 
[2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . 
https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. 
[2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. 
Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. 
[2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. 
[2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. 
Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. 
[2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. 
Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. 
[2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. 
[2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. 
In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. 
[2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. 
[2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. 
In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 
45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 
45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. 
Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). 
IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . 
https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
Andersson et al. [2010] Andersson, R., Nyström, M., Holmqvist, K.: Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research 3(3) (2010)
Ferhat and Vilariño [2016] Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541. Publisher: Hindawi Publishing Corporation
Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA '15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627
Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101
Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123
Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: TurkerGaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015)
Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Human Information Interaction and Retrieval. CHIIR '17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170
Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN'20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662
Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature Communications 11(1), 4553 (2020)
Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521
Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera: Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022)
Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023)
Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 https://aclanthology.org/2020.acl-main.421
Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 https://aclanthology.org/D16-1264
Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval, vol. 39. Cambridge University Press, Cambridge (2008)
Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing.
In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 
1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. 
[2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . 
https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. 
[2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. 
[2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. 
[2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. 
In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. 
[2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . 
https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. 
[2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. 
Behavior Research Methods 48(3), 829–842 (2016). https://doi.org/10.3758/s13428-015-0642-8

Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421

Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264

Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval, vol. 39. Cambridge University Press, Cambridge (2008)

Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000)

Just, M.A., Carpenter, P.A.: A theory of reading: From eye fixations to comprehension. Psychological Review 87(4), 329 (1980)

DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408

Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021)

Chiang, C.-H., Lee, H.-Y.: Re-examining human annotations for interpretable NLP (2022)

Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023)

Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022)

Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155. IEEE (2021)

Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423

Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022)

Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023)

Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
  42. Ferhat, O., Vilariño, F.: Low Cost Eye Tracking: The Current Panorama. Computational Intelligence and Neuroscience 2016, 8680541 (2016) https://doi.org/10.1155/2016/8680541 . Publisher: Hindawi Publishing Corporation Papoutsaki [2015] Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). 
https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. 
[2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. 
[2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Papoutsaki, A.: Scalable Webcam Eye Tracking by Learning from User Interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA ’15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627 . https://doi.org/10.1145/2702613.2702627 Sugano et al. [2013] Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. 
IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. 
[2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 
71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. 
Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013) https://doi.org/10.1109/TPAMI.2012.101 Lu et al. [2014] Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. 
[2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. 
[2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014) https://doi.org/10.1109/TPAMI.2014.2313123 Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. 
Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. 
Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423. https://aclanthology.org/N19-1423
Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022)
Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023)
Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
Xu et al. [2015] Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: TurkerGaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015)
Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam eye tracking for remote studies of web search. In: Proceedings of the 2017 Conference on Human Information Interaction and Retrieval. CHIIR '17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170
Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-based personalized summarization of Wikipedia reading session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN '20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662
Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature Communications 11(1), 4553 (2020)
Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022). https://doi.org/10.1016/j.bspc.2022.103521
Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022)
Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023)
Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421. https://aclanthology.org/2020.acl-main.421
Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264. https://aclanthology.org/D16-1264
Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval, vol. 39. Cambridge University Press, Cambridge (2008)
Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016). https://doi.org/10.3758/s13428-015-0642-8
Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000)
Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: From eye fixations to comprehension. Psychological Review 87(4), 329 (1980)
DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408. https://aclanthology.org/2020.acl-main.408
Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021)
Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-examining human annotations for interpretable NLP (2022)
Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023)
Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022)
Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE
71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. 
Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 
71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. 
Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. 
Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. 
[2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. 
Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. 
In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . 
https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. 
Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
  43. Papoutsaki, A.: Scalable webcam eye tracking by learning from user interactions. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. CHI EA '15, pp. 219–222. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2702613.2702627
  44. Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013). https://doi.org/10.1109/TPAMI.2012.101
  45. Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014). https://doi.org/10.1109/TPAMI.2014.2313123
  46. Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: TurkerGaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015)
  47. Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam eye tracking for remote studies of web search. In: Proceedings of the 2017 Conference on Human Information Interaction and Retrieval. CHIIR '17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170
  48. Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-based personalized summarization of Wikipedia reading session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN'20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662
  49. Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature Communications 11(1), 4553 (2020)
  50. Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022). https://doi.org/10.1016/j.bspc.2022.103521
  51. Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera – focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022)
  52. Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023)
  53. Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421
  54. Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264
  55. Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval, vol. 39. Cambridge University Press, Cambridge (2008)
  56. Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016). https://doi.org/10.3758/s13428-015-0642-8
  57. Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000)
  58. Just, M.A., Carpenter, P.A.: A theory of reading: From eye fixations to comprehension. Psychological Review 87(4), 329 (1980)
  59. DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408
  60. Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021)
  61. Chiang, C.-H., Lee, H.-Y.: Re-examining human annotations for interpretable NLP (2022)
  62. Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023)
  63. Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022)
  64. Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155. IEEE (2021)
  65. Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423
  66. Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022)
  67. Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023)
  68. Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. 
[2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . 
https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. 
[2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. 
[2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. 
[2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. 
In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. 
[2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . 
https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. 
[2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. 
Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. 
[2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. 
[2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. 
Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. 
[2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. 
Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. 
[2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. 
[2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. 
In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. 
[2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. 
[2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. 
In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 
45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 
45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. 
Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). 
IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . 
https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
  44. Sugano, Y., Matsushita, Y., Sato, Y.: Appearance-based gaze estimation using visual saliency. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(2), 329–341 (2013). https://doi.org/10.1109/TPAMI.2012.101
  45. Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014). https://doi.org/10.1109/TPAMI.2014.2313123
  46. Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: TurkerGaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015)
  47. Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam eye tracking for remote studies of web search. In: Proceedings of the 2017 Conference on Human Information Interaction and Retrieval (CHIIR '17), pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170
  48. Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-based personalized summarization of Wikipedia reading session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext (HUMAN '20). Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662
  49. Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature Communications 11(1), 4553 (2020)
  50. Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022). https://doi.org/10.1016/j.bspc.2022.103521
  51. Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera – Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022)
  52. Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023)
  53. Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421
  54. Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264
  55. Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval, vol. 39. Cambridge University Press, Cambridge (2008)
  56. Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016). https://doi.org/10.3758/s13428-015-0642-8
  57. Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000)
  58. Just, M.A., Carpenter, P.A.: A theory of reading: From eye fixations to comprehension. Psychological Review 87(4), 329 (1980)
  59. DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408
  60. Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021)
  61. Chiang, C.-H., Lee, H.-Y.: Re-examining human annotations for interpretable NLP (2022)
  62. Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023)
  63. Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022)
  64. Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155. IEEE (2021)
[2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. 
[2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. 
Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. 
In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. 
In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. 
Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 
Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022)
Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155. IEEE (2021)
Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423
Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022)
Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023)
Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016). https://doi.org/10.3758/s13428-015-0642-8
Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000)
Just, M.A., Carpenter, P.A.: A theory of reading: From eye fixations to comprehension. Psychological Review 87(4), 329 (1980)
DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408
Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021)
Chiang, C.-H., Lee, H.-Y.: Re-examining human annotations for interpretable NLP (2022)
Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023)
Lu, F., Sugano, Y., Okabe, T., Sato, Y.: Adaptive linear regression for appearance-based gaze estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10), 2033–2046 (2014). https://doi.org/10.1109/TPAMI.2014.2313123
Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: TurkerGaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015)
Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam eye tracking for remote studies of web search. In: Proceedings of the 2017 Conference on Human Information Interaction and Retrieval (CHIIR '17), pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170
Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-based personalized summarization of Wikipedia reading session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext (HUMAN '20). Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662
Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature Communications 11(1), 4553 (2020)
Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022). https://doi.org/10.1016/j.bspc.2022.103521
Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera: Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022)
Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023)
Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421
Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264
Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval, vol. 39. Cambridge University Press, Cambridge (2008)
[2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). 
https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). 
https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 
45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. 
[2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 
71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. 
Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. 
In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. 
In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. 
Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421. https://aclanthology.org/2020.acl-main.421
Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264. https://aclanthology.org/D16-1264
Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press, Cambridge (2008)
Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016). https://doi.org/10.3758/s13428-015-0642-8
Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000)
Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: From eye fixations to comprehension. Psychological Review 87(4), 329 (1980)
DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408. https://aclanthology.org/2020.acl-main.408
Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021)
Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-examining human annotations for interpretable NLP (2022)
Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023)
Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022)
Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155. IEEE (2021)
Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423. https://aclanthology.org/N19-1423
Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022)
Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023)
Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
  46. Xu, P., Ehinger, K.A., Zhang, Y., Finkelstein, A., Kulkarni, S.R., Xiao, J.: Turkergaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv:1504.06755 (2015) Papoutsaki et al. [2017] Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. 
[2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. 
In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 
1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search. In: Proceedings of the 2017 Conference on Conference Human Information Interaction And Retrieval. CHIIR ’17, pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170 . https://doi.org/10.1145/3020165.3020170 Dubey et al. [2020] Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. 
[2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 
71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. 
Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext. HUMAN’20. Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662 . https://doi.org/10.1145/3406853.3432662 Valliappan et al. [2020] Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. 
[2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 
71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. 
Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature communications 11(1), 4553 (2020) Lin et al. [2022] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. 
[2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . 
https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. 
[2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022) https://doi.org/10.1016/j.bspc.2022.103521 Guan et al. [2022] Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022) Hutt et al. [2023] Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). 
https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 
45 (2023)
Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022)
Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155. IEEE (2021)
Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423. https://aclanthology.org/N19-1423
Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022)
Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023)
Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022)
Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023)
Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421. https://aclanthology.org/2020.acl-main.421
Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264. https://aclanthology.org/D16-1264
Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval, vol. 39. Cambridge University Press, Cambridge (2008)
Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016). https://doi.org/10.3758/s13428-015-0642-8
Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000)
Just, M.A., Carpenter, P.A.: A theory of reading: From eye fixations to comprehension. Psychological Review 87(4), 329 (1980)
DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408. https://aclanthology.org/2020.acl-main.408
Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021)
Chiang, C.-H., Lee, H.-Y.: Re-examining human annotations for interpretable NLP (2022)
Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023)
IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . 
https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
  47. Papoutsaki, A., Laskey, J., Huang, J.: SearchGazer: Webcam eye tracking for remote studies of web search. In: Proceedings of the 2017 Conference on Human Information Interaction and Retrieval (CHIIR '17), pp. 17–26. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3020165.3020170
  48. Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-based personalized summarization of Wikipedia reading session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext (HUMAN '20). Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662
  49. Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature Communications 11(1), 4553 (2020)
  50. Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022). https://doi.org/10.1016/j.bspc.2022.103521
  51. Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera: Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022)
  52. Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023)
  53. Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421
  54. Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264
  55. Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval, vol. 39. Cambridge University Press, Cambridge (2008)
  56. Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016). https://doi.org/10.3758/s13428-015-0642-8
  57. Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000)
  58. Just, M.A., Carpenter, P.A.: A theory of reading: From eye fixations to comprehension. Psychological Review 87(4), 329 (1980)
  59. DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408
  60. Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021)
  61. Chiang, C.-H., Lee, H.-Y.: Re-examining human annotations for interpretable NLP (2022)
  62. Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023)
  63. Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022)
  64. Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155. IEEE (2021)
  65. Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423
  66. Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022)
  67. Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023)
  68. Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
[2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. 
Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. 
[2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. 
[2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. 
Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. 
[2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. 
Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. 
[2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. 
[2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. 
In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. 
[2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. 
[2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. 
In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 
45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 
45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. 
Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). 
IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . 
https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
  48. Dubey, N., Setia, S., Verma, A.A., Iyengar, S.R.S.: WikiGaze: Gaze-Based Personalized Summarization of Wikipedia Reading Session. In: Proceedings of the 3rd Workshop on Human Factors in Hypertext (HUMAN'20). Association for Computing Machinery, New York, NY, USA (2020). https://doi.org/10.1145/3406853.3432662
  49. Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature Communications 11(1), 4553 (2020)
  50. Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022). https://doi.org/10.1016/j.bspc.2022.103521
  51. Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022)
  52. Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023)
  53. Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421
  54. Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264
  55. Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval, vol. 39. Cambridge University Press, Cambridge (2008)
  56. Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016). https://doi.org/10.3758/s13428-015-0642-8
  57. Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000)
  58. Just, M.A., Carpenter, P.A.: A theory of reading: From eye fixations to comprehension. Psychological Review 87(4), 329 (1980)
  59. DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408
  60. Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021)
  61. Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022)
  62. Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023)
  63. Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022)
  64. Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155. IEEE (2021)
  65. Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423
  66. Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022)
  67. Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023)
  68. Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
[2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 
71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. 
Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 
71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. 
Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. 
Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. 
[2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. 
Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
  49. Valliappan, N., Dai, N., Steinberg, E., He, J., Rogers, K., Ramachandran, V., Xu, P., Shojaeizadeh, M., Guo, L., Kohlhoff, K., et al.: Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nature Communications 11(1), 4553 (2020)
  50. Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022). https://doi.org/10.1016/j.bspc.2022.103521
  51. Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera——Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022)
  52. Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023)
  53. Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421. https://aclanthology.org/2020.acl-main.421
  54. Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264. https://aclanthology.org/D16-1264
  55. Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval, vol. 39. Cambridge University Press, Cambridge (2008)
  56. Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016). https://doi.org/10.3758/s13428-015-0642-8
  57. Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000)
  58. Just, M.A., Carpenter, P.A.: A theory of reading: From eye fixations to comprehension. Psychological Review 87(4), 329 (1980)
  59. DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408. https://aclanthology.org/2020.acl-main.408
  60. Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021)
  61. Chiang, C.-H., Lee, H.-Y.: Re-examining human annotations for interpretable NLP (2022)
  62. Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023)
  63. Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022)
  64. Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155. IEEE (2021)
  65. Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423. https://aclanthology.org/N19-1423
  66. Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022)
  67. Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023)
  68. Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
[2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. 
Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. 
In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . 
https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. 
Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
  50. Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., Zhou, Y., Zhang, X.: An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control 74, 103521 (2022). https://doi.org/10.1016/j.bspc.2022.103521
  51. Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera – Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022)
  52. Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023)
  53. Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421
  54. Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264
  55. Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval, vol. 39. Cambridge University Press, Cambridge (2008)
  56. Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016). https://doi.org/10.3758/s13428-015-0642-8
  57. Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000)
  58. Just, M.A., Carpenter, P.A.: A theory of reading: From eye fixations to comprehension. Psychological Review 87(4), 329 (1980)
  59. DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408
  60. Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021)
  61. Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022)
  62. Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023)
  63. Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022)
  64. Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155. IEEE (2021)
  65. Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423
  66. Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022)
  67. Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023)
  68. Schneegans et al.
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023) Artetxe et al. [2020] Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. 
Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. 
[2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421 . https://aclanthology.org/2020.acl-main.421 Rajpurkar et al. [2016] Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. 
Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 . https://aclanthology.org/D16-1264 Schütze et al. [2008] Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. 
Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 
71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. 
Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . 
https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. 
[2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. 
[2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). 
https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. 
  51. Guan, X., Lei, C., Huang, Y., Chen, Y., Du, H., Zhang, S., Feng, X.: An analysis of reading process based on real-time eye-tracking data with web-camera: Focus on English reading at higher education level. In: Proceedings of the 4th Workshop on Predicting Performance Based on the Analysis of Reading Behavior (2022)
  52. Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023)
  53. Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421
  54. Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264
  55. Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval, vol. 39. Cambridge University Press, Cambridge (2008)
  56. Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016). https://doi.org/10.3758/s13428-015-0642-8
  57. Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000)
  58. Just, M.A., Carpenter, P.A.: A theory of reading: From eye fixations to comprehension. Psychological Review 87(4), 329 (1980)
  59. DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408
  60. Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021)
  61. Chiang, C.-H., Lee, H.-Y.: Re-examining human annotations for interpretable NLP (2022)
  62. Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023)
  63. Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022)
  64. Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155. IEEE (2021)
  65. Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423
  66. Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022)
  67. Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023)
  68. Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. 
[2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. 
[2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. 
In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 
45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 
45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. 
Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). 
IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . 
https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
  52. Hutt, S., Wong, A., Papoutsaki, A., Baker, R.S., Gold, J.I., Mills, C.: Webcam-based eye tracking to detect mind wandering and comprehension errors. Behavior Research Methods, 1–17 (2023)
  53. Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421
  54. Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264
  55. Manning, C.D., Raghavan, P., Schütze, H.: Introduction to Information Retrieval. Cambridge University Press, Cambridge (2008)
  56. Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016). https://doi.org/10.3758/s13428-015-0642-8
  57. Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000)
  58. Just, M.A., Carpenter, P.A.: A theory of reading: From eye fixations to comprehension. Psychological Review 87(4), 329 (1980)
  59. DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408
  60. Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021)
  61. Chiang, C.-H., Lee, H.-Y.: Re-examining human annotations for interpretable NLP (2022)
  62. Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023)
  63. Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022)
  64. Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155. IEEE (2021)
  65. Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423
  66. Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022)
  67. Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023)
  68. Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. 
[2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. 
Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. 
[2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. 
In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . 
https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. 
Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
  53. Artetxe, M., Ruder, S., Yogatama, D.: On the cross-lingual transferability of monolingual representations. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4623–4637. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.421
  54. Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264
  55. Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval. Cambridge University Press, Cambridge (2008)
  56. Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016). https://doi.org/10.3758/s13428-015-0642-8
  57. Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000)
  58. Just, M.A., Carpenter, P.A.: A theory of reading: From eye fixations to comprehension. Psychological Review 87(4), 329 (1980)
  59. DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408
  60. Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021)
  61. Chiang, C.-H., Lee, H.-Y.: Re-examining human annotations for interpretable NLP (2022)
  62. Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023)
  63. Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022)
  64. Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155. IEEE (2021)
  65. Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423
  66. Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022)
  67. Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023)
  68. Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
[2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press Cambridge, ??? (2008) Gureckis et al. [2016] Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8 Salvucci and Goldberg [2000] Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. 
Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000) Just and Carpenter [1980] Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. 
[2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. 
[2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. 
[2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . 
https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. 
[2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. 
[2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. 
[2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. 
In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. 
Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. 
In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. 
Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
  54. Rajpurkar, P., Zhang, J., Lopyrev, K., Liang, P.: SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 2383–2392. Association for Computational Linguistics, Austin, Texas (2016). https://doi.org/10.18653/v1/D16-1264 https://aclanthology.org/D16-1264
  55. Schütze, H., Manning, C.D., Raghavan, P.: Introduction to Information Retrieval vol. 39. Cambridge University Press, Cambridge (2008)
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Just, M.A., Carpenter, P.A.: A theory of reading: from eye fixations to comprehension. Psychological review 87(4), 329 (1980) DeYoung et al. [2020] DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. 
In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408 . https://aclanthology.org/2020.acl-main.408 Wiegreffe and Marasovic [2021] Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. 
In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. 
In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. 
In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. 
[2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
  56. Gureckis, T.M., Martin, J., McDonnell, J., Rich, A.S., Markant, D., Coenen, A., Halpern, D., Hamrick, J.B., Chan, P.: psiTurk: An open-source framework for conducting replicable behavioral experiments online. Behavior Research Methods 48(3), 829–842 (2016) https://doi.org/10.3758/s13428-015-0642-8
[2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. 
[2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. 
[2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. 
In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. 
Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
  57. Salvucci, D.D., Goldberg, J.H.: Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 71–78 (2000)
  58. Just, M.A., Carpenter, P.A.: A theory of reading: From eye fixations to comprehension. Psychological Review 87(4), 329 (1980)
  59. DeYoung, J., Jain, S., Rajani, N.F., Lehman, E., Xiong, C., Socher, R., Wallace, B.C.: ERASER: A benchmark to evaluate rationalized NLP models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 4443–4458. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.408. https://aclanthology.org/2020.acl-main.408
  60. Wiegreffe, S., Marasovic, A.: Teach me to explain: A review of datasets for explainable natural language processing. In: Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1) (2021) Chiang and Lee [2022] Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. 
Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. 
Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. 
In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. 
[2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
  61. Chiang, C.-H., Lee, H.-Y.: Re-Examining Human Annotations for Interpretable NLP (2022) Shubi and Berzak [2023] Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. 
In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. 
[2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. 
[2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
  62. Shubi, O., Berzak, Y.: Eye movements in information-seeking reading. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 45 (2023) Birawo and Kasprowski [2022] Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. [2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022) Nuraini et al. 
[2021] Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE Devlin et al. [2019] Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. 
In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423 . https://aclanthology.org/N19-1423 Wisiecka et al. [2022] Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. 
[2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022) Ikhwantri et al. [2023] Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023) Schneegans et al. [2021] Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021) Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)
  63. Birawo, B., Kasprowski, P.: Review and evaluation of eye movement event detection algorithms. Sensors 22(22), 8810 (2022)
  64. Nuraini, A., Murnani, S., Ardiyanto, I., Wibirama, S.: Machine learning in gaze-based interaction: A survey of eye movements events detection. In: 2021 International Conference on Computer System, Information Technology, and Electrical Engineering (COSITE), pp. 150–155 (2021). IEEE
  65. Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423. https://aclanthology.org/N19-1423
  66. Wisiecka, K., Krejtz, K., Krejtz, I., Sromek, D., Cellary, A., Lewandowska, B., Duchowski, A.: Comparison of webcam and remote eye tracking. In: 2022 Symposium on Eye Tracking Research and Applications, pp. 1–7 (2022)
  67. Ikhwantri, F., Putra, J.W.G., Yamada, H., Tokunaga, T.: Looking deep in the eyes: Investigating interpretation methods for neural models on reading tasks using human eye-movement behaviour. Information Processing & Management 60(2), 103195 (2023)
  68. Schneegans, T., Bachman, M.D., Huettel, S.A., Heekeren, H.: Exploring the potential of online webcam-based eye tracking in decision-making research and influence factors on data quality (2021)